Room: Rossetti Room


Moneyball and the Science of Building Great Testing Teams

Wednesday, November 7, 2012: 9:45 AM - 10:45 AM

Moneyball is about baseball. But it is also about breaking down accepted preconceptions and finding new ways to look at individual skills and how they mesh as a team. Sometimes the characteristics we believe a team needs aren't all that important in assessing and improving quality. It is also about people deceiving themselves, believing something to be true because they think they experienced it. In fact, some of a team's accepted practices may have less of an impact on quality than we think. This presentation examines how to use data to tell the right story about our success in shipping high-quality applications. It takes a look at some of our preconceptions about testing and individual skills. It identifies the characteristics necessary to building and running a high-performance testing team. And it applies the Moneyball approach to testing and quality to give teams the best bang for their buck in evaluating their own capabilities and requirements, and delivering the highest quality possible.

  • Understand the dynamic of your team and its impact on improving software quality
  • Recognize your own preconceptions and biases in evaluating quality
  • Know when to rely on instinct for assessing quality and when to take the time to make a deliberate and informed decision

Defining the Value of GUI Test Automation

Wednesday, November 7, 2012: 11:00 AM - 12:30 PM

Today, we have a good understanding of the technical details for implementing GUI test automation. However, many test automation teams still struggle with understanding their mission. These teams find measuring their effectiveness and communicating their value to stakeholders an overwhelming challenge.

In his presentation, Yury will discuss how a test automation team can learn what its stakeholders really want and, in doing so, define its real mission. He will also share ideas for choosing metrics that measure effectiveness and communicate the team's real value to stakeholders. Using this approach will help your test automation team become more successful and more valued.

  • Understand that the value of test automation depends on context.
  • Learn to correctly define test automation priorities based on the priorities of the development organization.
  • Explore different metrics that should be used to measure and report the value of test automation in different types of organizations.

Effectively Implementing Test Automation: Performing an Internal Assessment

Wednesday, November 7, 2012: 1:30 PM - 3:00 PM

Your organization has implemented software test automation and is not realizing the value it expected. Or perhaps it's just starting to consider implementation and you wish to be proactive. The scope and complexities of testing are increasing as new technologies and environments emerge, applications become more advanced, and users become more astute. Including test automation can improve your testing process, but you're then faced with numerous considerations. What are the primary objectives of the QA team? What challenges should be expected? Is our technical environment ready? Does our team have the necessary skills? How will we measure success? Conducting a software test automation assessment will address these questions and help you focus your automation efforts.

In this interactive workshop you will be guided through a software test automation assessment process and have the opportunity to assess your own organization. A documented Software Test Automation Assessment template will be provided to all workshop attendees.

  • Explore critical test organization assessment areas
  • Learn key discovery questions to ask during an assessment process
  • Develop a realistic roadmap to software test automation success

Meta Data Based Test Automation

Wednesday, November 7, 2012: 3:30 PM - 4:30 PM

Most enterprises are struggling with functional test automation. When approached in an ad hoc manner, it is expensive, time consuming, and ineffective. Organizations must consolidate and streamline their automation approach to address new technologies and complex IT application portfolios.

In this presentation, Raj will discuss a new and innovative functional test automation framework. The framework is based on a meta-data approach that helps teams create automation scripts quickly and with high accuracy. In this session you will gain insight into the framework, learn how to apply it to various enterprise application testing needs, and hear tips for building it within your own enterprise.
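
The abstract does not spell out the framework's internals, so the following is only a minimal, hypothetical sketch of the general metadata-driven idea: test steps live in data (for example, a spreadsheet or database table), and a small engine maps each action keyword to an implementation. The action names, the sample login scenario, and the FakeBrowser stand-in are illustrative assumptions, not Raj's actual framework.

    # Hypothetical sketch of metadata-driven test automation (Python).
    # Test steps are plain data; a dispatcher maps "action" keywords to code.

    # Metadata describing one test case, e.g. loaded from a spreadsheet or database.
    LOGIN_TEST = [
        {"action": "open_page",   "target": "https://example.test/login"},
        {"action": "enter_text",  "target": "username", "value": "qa_user"},
        {"action": "enter_text",  "target": "password", "value": "secret"},
        {"action": "click",       "target": "submit"},
        {"action": "assert_text", "target": "banner",   "value": "Welcome"},
    ]

    class FakeBrowser:
        """Stand-in for a real driver (e.g. Selenium) so the sketch runs anywhere."""
        def __init__(self):
            self.state = {"banner": "Welcome"}
        def open(self, url): print(f"open {url}")
        def type(self, field, value): print(f"type '{value}' into {field}")
        def click(self, element): print(f"click {element}")
        def read(self, element): return self.state.get(element, "")

    def assert_contains(actual, expected):
        assert expected in actual, f"expected '{expected}', got '{actual}'"
        print(f"assert ok: '{expected}'")

    def run_test(steps, browser):
        """Interpret each metadata row by dispatching on its action keyword."""
        actions = {
            "open_page":   lambda s: browser.open(s["target"]),
            "enter_text":  lambda s: browser.type(s["target"], s["value"]),
            "click":       lambda s: browser.click(s["target"]),
            "assert_text": lambda s: assert_contains(browser.read(s["target"]), s["value"]),
        }
        for step in steps:
            actions[step["action"]](step)

    if __name__ == "__main__":
        run_test(LOGIN_TEST, FakeBrowser())

Because the scripts are data rather than code, adding a new test usually means adding rows of metadata, which is where the speed and accuracy claims for this style of framework typically come from.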

  • Discover a meta-data based framework for functional test automation
  • Learn how to apply this framework to various enterprise application testing needs
  • Hear pointers for implementing this framework

Implementing QA Processes on Large-Scale Projects

Thursday, November 8, 2012: 9:45 AM - 10:45 AM

Working on a large, complex project, over a year in duration and involving more than 100 resources, requires special considerations when developing overall QA strategies and test plans. This presentation provides attendees with insights based on a $70 million PeopleSoft implementation that included additional components such as Business Intelligence Reporting and a Data Integration Hub. Tasked with providing third-party validation by managing the test effort for the project, Shaun Bradshaw acted as QA Architect, providing the test strategy and planning for the implementation vendor's customization and development efforts. Shaun discusses the real-world experience he gained, including how to prepare, communicate, and manage the QA effort for such a complex project. He also shares the obstacles and challenges he and his team had to overcome to deliver as smooth an implementation and go-live as possible.

  • Identifying key QA strategic activities on a large, complex project
  • Grasping test planning considerations necessary for component system testing and solution integration testing
  • Leveraging a proven method for efficient test scenario development, execution, and re-use

Lean Techniques for Managing Testing

Thursday, November 8, 2012: 11:00 AM - 12:30 PM

Lean techniques used in testing are mechanisms to scale testing processes while optimizing the testing value stream. In today's corporate environment, lean and agile are used almost synonymously, but they are not the same: lean focuses on systems thinking and process optimization, while agile focuses on people. Lean concentrates on the workflow through a testing process. Tools like value stream mapping and kanban can be used to maximize the value of testing while enhancing overall team performance. Combining lean and agile can optimize both the people equation and the process equation. Join Tom to understand how a combination of both techniques, applied correctly, can benefit your own organization.
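
As a concrete illustration of the flow focus, Little's Law (average lead time = work in process / throughput) is one simple way a kanban-style WIP limit can be reasoned about quantitatively. The numbers below are made up for the example and are not from Tom's talk.

    # Illustrative only: Little's Law applied to a testing workflow.
    wip = 24            # test items currently in the workflow (queued + active)
    throughput = 8.0    # items the team completes per week

    lead_time_weeks = wip / throughput
    print(f"Average lead time: {lead_time_weeks:.1f} weeks")   # 3.0 weeks

    # Lowering the kanban WIP limit (throughput unchanged) shortens lead time.
    for wip_limit in (24, 16, 8):
        print(f"WIP {wip_limit:2d} -> lead time {wip_limit / throughput:.1f} weeks")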

  • Explore how lean and agile can work together in a synergistic manner
  • Learn about the lean tool kit that focuses on optimizing the entire process, including testing
  • Discover that flow is an important attribute and measure

Model Based Effort Estimation for Software Testing

Thursday, November 8, 2012: 1:30 PM - 3:00 PM

Model Based Effort Estimation leverages the common principles of successful estimation techniques and provides a structured approach to producing estimates that exploit repetitive patterns in work items, within and between projects. This technique also maximizes an organization's ability to review estimates, ensure consistency across them, and update them over time, leading to greater accuracy earlier in the project.

This workshop covers an overview of estimation practices and their historical adoption, the difference between "reasonable" and "accurate" estimates and which one best fits your organization, and the characteristics of effective estimation techniques and completeness. Bruce will also discuss unloaded versus loaded hours and why estimates should be done in raw hours. You will review a summary of common counting techniques that can be used to quantify work scope and be customized to a specific project. Finally, Bruce will lead a discussion of how to recognize repetitive patterns in project work items, within and across projects, and how to leverage these patterns.
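
The abstract does not give a worked example, but a simple counting-based estimate might look like the sketch below: count repetitive work items by type, multiply by historical unit effort in raw (unloaded) hours, and apply a loading factor afterwards. The item types, rates, and loading factor are assumptions for illustration, not Bruce's model.

    # Hypothetical counting-based estimate (Python).
    HISTORICAL_HOURS_PER_ITEM = {    # raw (unloaded) hours per item, from past projects
        "simple_test_case": 1.5,
        "complex_test_case": 4.0,
        "automation_script": 6.0,
    }

    counts = {                       # counted scope for the project being estimated
        "simple_test_case": 120,
        "complex_test_case": 40,
        "automation_script": 25,
    }

    LOADING_FACTOR = 1.3             # meetings, rework, and other overhead

    raw_hours = sum(HISTORICAL_HOURS_PER_ITEM[item] * n for item, n in counts.items())
    loaded_hours = raw_hours * LOADING_FACTOR

    print(f"Raw (unloaded) estimate: {raw_hours:.0f} hours")    # 490 hours
    print(f"Loaded estimate:         {loaded_hours:.0f} hours")  # 637 hours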

  • Discover how to use Model Based Effort Estimation
  • Learn to recognize and leverage repetitive work patterns
  • Realize the power of common counting techniques

The Yin and Yang of Metrics

Thursday, November 8, 2012: 3:30 PM - 4:30 PM

Metrics are powerful tools, but they are often situation dependent. Metrics that are insightful in a waterfall approach may not be productive in an agile scenario. In this presentation, QA and testing expert Shaun Bradshaw will discuss the value and pitfalls of various metrics under alternate development methodologies. Shaun will take the traditional test manager's role with a focus on waterfall, and then approach metrics from an agile perspective. Join Shaun for a lively debate and learn which metrics work (and don't) in waterfall versus agile, which universal metrics provide value in both methodologies, how metrics evolve with methodologies, and what to keep and what to discard as your organization grows.

  • Appreciate the value of metrics and how to utilize them in alternative development methodologies
  • Understand the differences and similarities between metrics for agile versus traditional waterfall methodologies
  • Be aware of potential metric pitfalls as well as what to track and what not to track