TesTrek Workshops



WEDNESDAY, OCTOBER 20 - MORNING

Metrics and Measurement in Software Testing
Vahid Garousi, PhD, University of Calgary

Track 1: 11:00 - 12:30

As with other stages and activities in software engineering, metrics are vital to the success of software testing.  Software testing metrics can be classified into several subcategories: product, test suite, process/progress, resource, defect, and management.  In this workshop, the different software testing metrics will be discussed and real-world examples will be presented.  Leading-edge approaches including agile methodologies, test-driven development (TDD), and automated testing will be explored.  Many resources on testing metrics and measurement are available today, but each provides only a limited view of the topic.  This workshop aims to provide a consistent view of all related metrics.  The latest trends from both industry and academia will be presented.

  1. Become familiar with different types of software testing metrics
  2. Learn to apply those metrics to conduct measurements in software testing projects
  3. Be able to use the metrics and measurements to apply process improvement practices
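
For illustration only (not part of the workshop materials), a minimal sketch of two commonly used measures, defect density and test-case pass rate, computed from hypothetical project figures:

    # Illustrative only: two common software testing metrics with hypothetical numbers.

    def defect_density(defects_found: int, size_kloc: float) -> float:
        """Defects per thousand lines of code (KLOC)."""
        return defects_found / size_kloc

    def pass_rate(passed: int, executed: int) -> float:
        """Fraction of executed test cases that passed."""
        return passed / executed

    # Hypothetical figures for a 120 KLOC release.
    print(defect_density(defects_found=42, size_kloc=120.0))  # 0.35 defects per KLOC
    print(pass_rate(passed=1830, executed=2000))              # 0.915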

About the Workshop Leader...
Vahid Garousi, PhD, is an Assistant Professor of Software Engineering in the Department of Electrical and Computer Engineering at the University of Calgary.  He conducts research, teaches university courses, and provides industrial training and coaching in the areas of software testing, UML, OO development, and agile software development.  Vahid received a PhD in Software Engineering from Carleton University; his PhD work centered on performance testing of distributed real-time systems.  His MSc degree is in Electrical and Computer Engineering.  During his graduate studies, he worked on projects with Bell Mobility, Siemens, and IBM.  He is a member of the IEEE and the IEEE Computer Society, and is also a licensed professional engineer (PEng) in Canada.


Tour-Based Testing: The Hacker's Landmark Tour
Rafal Los, Hewlett-Packard

Track 2: 11:00 - 12:30

When visiting a new city, people often take an organized tour, going from landmark to landmark to get an overview of the town.  Taking a "tour" of an application, going from function to function, is a good way to break the testing effort into manageable chunks.  Not only is this approach useful in functional testing, it is also effective for security testing.  In this presentation, Rafal takes you inside the hacker's world, identifying the landmarks hackers target within applications and showing you how to identify the defects they seek out.  Learn what "landmarks" are, how to identify them from functional specifications, and how to tailor negative testing strategies to different landmark categories.  Test teams, already stretched for time and resources and now saddled with security testing, will learn how to pinpoint, among the mountains of vulnerabilities often uncovered in security testing, the defects that could compromise the entire application.

  • Why do hackers attack our sites?
  • How and why do QA testers figure into "security testing"?
  • How can testers think like a hacker and incorporate new strategies into existing testing?
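
As a loose illustration of what a negative test against one such "landmark" might look like, here is a short sketch; the endpoint, field names, and payloads are hypothetical and not taken from the workshop:

    # Illustrative sketch only: probing a hypothetical login "landmark" with a few
    # classic negative/security inputs. The URL and field names are made up.
    import requests

    SUSPECT_INPUTS = [
        "' OR '1'='1",                # SQL injection probe
        "<script>alert(1)</script>",  # reflected XSS probe
        "A" * 5000,                   # oversized input / input-handling check
    ]

    def probe_login(base_url: str) -> None:
        for payload in SUSPECT_INPUTS:
            resp = requests.post(f"{base_url}/login",
                                 data={"username": payload, "password": "x"},
                                 timeout=10)
            # A 500 error, or the payload echoed back unencoded, is worth a defect report.
            print(resp.status_code, payload in resp.text)

    probe_login("http://test-environment.example.com")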

About the Workshop Leader...
Rafal Los, Web Application Security SME with Hewlett-Packard's Application Security Center, is a ten-year industry veteran who has worked in a variety of security positions, from consultant to Information Security Officer, within the Fortune 100.  Rafal's unique blend of technical expertise and business knowledge enables him to teach audiences about security techniques, programs, and processes that they can both understand strategically and apply realistically.  He has extensive experience in security testing, risk analysis and management, penetration testing, and architecture and policy.  Rafal is an accomplished writer and speaker, maintaining two popular blogs and making numerous appearances at software quality and security conferences.


Beyond the Templates: Test Planning in Context
Lynn McKee, Quality Perspectives

Track 3: 11:00 - 12:30

Test planning is an important element of every testing effort, although it appears in many forms.  For some projects, test planning is a formal exercise focused on the development of a comprehensive, pre-emptive document.  For other projects, test planning is a dynamic, lightweight exercise constantly adapting to changes in the project and organizational needs.  Templates are often developed for test planning in an attempt to ensure “best practices” and standardization.  It is important to consider that this standardization may restrict our ability to apply critical and creative thinking and to respond effectively to specific circumstances.  Testers need to focus on effective, context-driven test planning.  Beyond the confines of templates and standardization, the objective remains to define a strategy for gathering insightful and timely quality-related information for stakeholders.  Context-driven test planning aligns the test strategy with the organization's unique goals and project-specific constraints, and emphasizes the importance of adaptability.

  • Review the diverse use of the term “test planning” and common approaches to the subject
  • Examine the challenges of using cookie-cutter test planning templates and approaches
  • Share the concepts behind context-driven testing and the importance of adaptability

About the Workshop Leader...
Lynn McKee is an independent consultant with 15 years of experience in the IT industry and a passion for helping organizations, teams, and individuals deliver valuable software.  Lynn provides consulting on software quality, testing, and building high-performing teams.  An advocate of the context-driven perspective, she focuses on ensuring testing teams are enabled with effective, adaptive, and scalable approaches aligned with the organization's quality needs.  Lynn is an active member of numerous software testing associations, speaks at conferences, writes articles, and contributes to blogs and forums.  She is the co-founder of the Calgary Perspectives on Software Testing Workshop.


WEDNESDAY, OCTOBER 20 - AFTERNOON

Test Process Improvement: Lessons Learned From the Trenches
Peter Walen, ISD Corporation

Track 1: 3:00 - 4:30

Testers are under ever-greater pressure to increase the depth, coverage, and quality of testing in shorter amounts of time.  Test leads and managers often turn to the process of testing itself to gain these improvements.  Even when testing is “good,” the question becomes, “Is it good enough?”  Can it be improved without disrupting what the test team is already doing well?  Are there other factors that may be impacting the testing effort?  We will look at what can be done to help the testers themselves, as well as what others can do to allow the testers to be more efficient.  Using case studies as examples, we will investigate what has and has not worked for the presenter and participants.  We will then explore ways it might have been done better.

  • Assess the current state of testing
  • Discover solutions to improve the test process
  • Communicate, gain buy-in and implement changes

About the Workshop Leader...
Peter Walen has been in software development for over 25 years.  After working many years as a programmer, he moved to software testing and QA.  Following a brief foray into Project Management and Business Analysis, he returned to software testing.  He has worked in the fields of Insurance and Finance, Manufacturing, Higher Education, Retail, Distribution, and Point-of-Sale Systems.  Peter is an active member of several testing associations, an active blogger on software testing, and a moderator on SQAForums.com.


In Pursuit of Business Value: The Relevance of Emerging Trends
Clint Sprauve, Micro Focus

Track 2: 3:00 - 4:30

Why do “emerging trends” never stop emerging?  We adopt the latest software development method and immediately there is a new one.  Perhaps this happens because we still have not solved the software project success problem.  So we keep trying new things to align IT with the business and bring about successful software implementations.  New approaches like agile are gaining popularity, but are they working?  Analysts are talking about “driving quality” from the start of the lifecycle, which is logical, but how can we execute on this?  To what extent is your organization tracking these emerging trends, and how can you determine whether adoption is optimizing business value?

  1. Software project trends and their business relevance
  2. Emerging trends driving change for quality and development professionals
  3. The business relevance or irrelevance of process models

About the Workshop Leader...
A senior solutions engineer for Silk Quality Solutions at Micro Focus, Clinton Sprauve has more than fifteen years of experience in the software quality assurance industry.  Previously he was the senior product marketing manager for SilkCentral Test Manager at Borland/Segue Software, and served as a senior technical sales engineer for both companies.  Clint has also been an independent consultant specializing in test management and test automation.


Risk Based Testing on Steroids!
Todd Kuczaj and Zach Gallentine, Accenture

Track 3: 3:00 - 4:30

As organizations strive to speed up software development in an effort to reduce time-to-market, there is a significant probability that software development processes will be compromised to "make the date."  These compromises always seem to have a negative effect on the test team, which has the difficult, if not impossible, task of helping ensure a quality release before shipping.  Todd and Zach will discuss ways test managers can implement risk-based testing and describe fundamental quality and risk management techniques that can be leveraged throughout the development life cycle.  Learn the essentials of managing risk: identification, analysis, prioritization, response planning, resolution, and monitoring.  And learn the basics of risk-based testing: what it is, why it is relevant to testing, how to implement it in your organization, and how to apply it throughout the software development lifecycle.

  • Explore the problems created by high-speed development
  • Learn the essentials of managing risk
  • Understand the basics of risk-based testing
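
As a rough illustration of the prioritization step mentioned above (the feature names and ratings are hypothetical, not part of the workshop), a common approach scores each item by likelihood of failure times business impact and tests the highest scores first:

    # Illustrative sketch only: simple risk-based test prioritization using a
    # likelihood x impact score. Feature names and ratings are hypothetical.
    features = [
        # (feature, likelihood of failure 1-5, business impact 1-5)
        ("payment processing", 3, 5),
        ("account login",      4, 4),
        ("report export",      2, 2),
    ]

    ranked = sorted(features, key=lambda f: f[1] * f[2], reverse=True)
    for name, likelihood, impact in ranked:
        print(f"{name}: risk score {likelihood * impact}")
    # Higher-scoring features get deeper and earlier test coverage.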

About the Workshop Leader...
Todd Kuczaj is a Senior Manager within Accenture's Global Testing Practice.  He has been with the company for over 9 years, almost exclusively in testing and quality roles. Currently, he is the overall program lead for a new 250-person centralized testing capability at a large US financial services firm.  In addition to this role, Todd also conducts test assessments and provides strategic insights for various other Accenture clients. He holds a Master of Science in Information Management from Arizona State University and a Bachelor of Arts from the University of Notre Dame.

Zach Gallentine is also a consultant within Accenture's Global Testing Practice.  He has been with the company for 5 years, exclusively in testing/quality roles, including test automation, test tools implementation, and applied statistics testing. Currently, he is helping lead the implementation of testing process improvements at a large US financial services firm.  He holds a Bachelor of Science in Computer Science from Taylor University.


THURSDAY, OCTOBER 21 - MORNING

A Model for Test Automation Success
Nazar Hossain, Accenture

Track 1: 11:00 - 12:30

Automated testing has had strong visibility in recent years.  A combination of mature technology and increased pressure on quality assurance teams to deliver consistently on budget and on time has put greater focus on leveraging automated testing tools as part of an overall QA program.  But just having the right tool is often not enough.  A successful automated testing practice also requires individuals across the IT enterprise to work within a robust governance model and processes.  A successful model covers the upfront identification and prioritization of scope, script development activities, and measurement of the final actual vs. planned ROI.  Today's successful automated testing capability encompasses all of these factors to consistently deliver a net ROI to a QA organization, both in cost savings and in improved speed to market.
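
For readers who want a concrete sense of the planned-versus-actual ROI comparison mentioned above, a minimal sketch with made-up figures (not from the session):

    # Illustrative sketch only: comparing planned vs. actual ROI for a test
    # automation effort. All figures are hypothetical.
    def roi(benefit: float, cost: float) -> float:
        """Return on investment as a fraction of cost."""
        return (benefit - cost) / cost

    planned = roi(benefit=300_000, cost=120_000)   # planned: 1.5 (150%)
    actual  = roi(benefit=210_000, cost=140_000)   # actual:  0.5 (50%)
    print(f"planned ROI {planned:.0%}, actual ROI {actual:.0%}")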

About the Workshop Leader...
Nazar Hossain leads Accenture’s Global Automated Testing Center of Excellence capability group.  The group has grown to over 100 resources in the past year with over 1,000 resources automating scripts on any given day.  Nazar has personally worked with clients across a broad range of industry groups including Financial Services, Consumer and Retail, Public Services, and Telecommunications over the past 5 years.


Effectively Managing the Testing Process Through Collaboration
Debra Forysth and Dave Lloyd, ObjectSharp Consulting

Track 2: 11:00 - 12:30

Communication is the key to the success or failure of any team, whether it's a sports team or a software development team creating the next app.  Without communication, team members have no idea how best to direct their efforts; with communication, the team can overcome almost any challenge.  Unfortunately, achieving the level of collaboration required for success can be difficult.  Solid development managers have a bag of tricks they use to foster communication among the different roles.  In this session, you will learn some of the tricks that can be used in your own environment to create better communication across all of the team roles.

  • How can the other team roles more effectively communicate with the testing team?
  • What does the test team need to know to complete the testing effort?

About the Workshop Leaders...
Debra Forysth is the Quality Assurance Practice Lead and a Senior QA Consultant with ObjectSharp Consulting.  As QA Practice Lead, Debra is an experienced Visual Studio 2010 and Microsoft Test Manager instructor and consultant, providing training and support to clients.  She excels in facilitating all phases of systems development, from object-oriented analysis, design, and development through testing and ongoing project management.  Debra is a dynamic communicator experienced in the design and execution of comprehensive quality assurance processes, with proven qualifications in test life cycle, strategy and planning, case and data design, risk assessment, functional and performance test automation, environment setup, and defect and configuration management.

Dave Lloyd is a senior partner and co-founder of ObjectSharp.  He has 25 years of experience in the IT industry designing and building software solutions for a large number of clients in varying industries.  Dave is a seasoned project manager with extensive experience introducing process into development teams, from small and large ISVs to in-house development groups.  Dave has also spent significant time during his career implementing test solutions for clients, working with the most current automated test tools and building successful testing environments.  He also brings over 20 years of teaching experience.


Parallel Data Testing: The Next Frontier of Quality Assurance
Daniel Dopp and Angsuman Dutta, Infogix

Track 3: 11:00 - 12:30

Several methods exist for functional and performance testing (F&PT) of software systems and processes; however, fewer options exist to ensure the accuracy, consistency, and reliability of the data itself.  Unlike F&PT, data testing cannot be performed prior to the data conversion or movement, and post-conversion data testing is costly because of the time and effort it takes to correct an issue and retest the results.  Parallel Data Testing (PDT) is therefore the most efficient and effective approach to ensuring data quality.  The workshop will cover PDT concepts and framework, common PDT controls, and a case study in which attendees participate in a facilitated discussion of the PDT required for the case study.

Learning Objectives

  • Learn an enterprise-level testing framework for ensuring data integrity
  • Learn the types of information controls required to prevent and detect data errors during the testing/movement process
  • Understand the cost-benefit analysis of performing PDT
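
To make the idea of a PDT control concrete, here is a minimal sketch (with hypothetical data, not taken from the workshop) that reconciles record counts and a column total between the legacy output and the parallel converted output:

    # Illustrative sketch only: a basic parallel data testing control comparing
    # record counts and a column total between legacy and converted data.
    # Field names and values are hypothetical.
    def reconcile(source_rows, target_rows, amount_field="amount"):
        checks = {
            "record_count": (len(source_rows), len(target_rows)),
            "amount_total": (sum(r[amount_field] for r in source_rows),
                             sum(r[amount_field] for r in target_rows)),
        }
        for name, (src, tgt) in checks.items():
            status = "OK" if src == tgt else "MISMATCH"
            print(f"{name}: source={src} target={tgt} -> {status}")

    reconcile(
        source_rows=[{"amount": 100.0}, {"amount": 250.5}],
        target_rows=[{"amount": 100.0}, {"amount": 250.5}],
    )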

About the Workshop Leaders...
Daniel Dopp is a thought leader in the emerging information integrity space and has assisted numerous Fortune 500 enterprises in developing and implementing strategies to ensure the integrity of data testing through the implementation of information controls.  He specializes in assessing information risks and performing cost-benefit analysis.  In his current role, Dan is responsible for market awareness of automated independent controls to prevent and detect information errors.  Before Infogix, Dan held positions at Zurich American Insurance, SunGuard Investment Systems, and The Northern Trust Company.  Dan earned a BS in Finance from Illinois State University and an MBA from DePaul University.

Angsuman Dutta is an expert in information risk assessment and control implementation.  He has assisted numerous enterprises in defining data testing requirements for system conversion and migration projects, and has led projects to define testing criteria and processes for operational data (such as ETL processes).  Dutta has authored numerous articles in trade journals, including the ISACA Journal, ASQ, and Information Management, and has presented at a number of national and international conferences in the USA and abroad.  He holds an MS in Computer Science from the Illinois Institute of Technology and an MBA in Analytical Finance from the University of Chicago.


THURSDAY, OCTOBER 21 - AFTERNOON

Requirements Testing Techniques for Today's Super QA Analyst
Erica Frazier, Wells Fargo, and Melandee Jones, Ally Financial

Track 1: 3:00 - 4:30

How does a QA Analyst stay competitive in a global market during a recession?  How can one prove the value of the QA role in delivering reliable business applications that satisfy requirements?  The answer is rather straightforward: by morphing into the Super Analyst.  All professions and roles grow, and our profession of quality assurance is no different.  Our role today is broader, expanding into more technical skill sets, and deeper, with the ability to dig into the details.  In this workshop you will learn how to test for more than just functional requirements.  Specifically, we will address expanding the scope of requirements-based testing to include architectural, system/non-functional, data, and other technical requirements.  Join us as we reveal the exciting new world of the Super QA Analyst, and participate in hands-on test scenarios that will let you try out this new role.

  • An Introduction to Requirements Based Testing
  • Creating Scripts and Cases from Requirements
  • Negative Testing, Data Integrity, and the Traceability Matrix
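
As a small illustration of the traceability matrix mentioned above (the requirement and test case IDs are hypothetical, not workshop materials), one simple representation maps each requirement to the test cases that cover it and flags gaps:

    # Illustrative sketch only: a minimal requirements-to-test-case traceability
    # matrix. Requirement and test case IDs are hypothetical.
    requirements = ["REQ-001", "REQ-002", "REQ-003"]
    traceability = {
        "REQ-001": ["TC-101", "TC-102"],   # functional + negative test
        "REQ-002": ["TC-201"],             # data integrity check
        # REQ-003 intentionally has no test cases yet
    }

    for req in requirements:
        cases = traceability.get(req, [])
        status = ", ".join(cases) if cases else "NOT COVERED"
        print(f"{req}: {status}")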

About the Workshop Leaders...
Erica Frazier brings over twelve years of analytical experience in business and technical roles.  She is a results-oriented professional with demonstrated skill in leading for success.  In her current role she serves as Vice President of Treasury Risk Management with Wells Fargo Corporation.  She obtained her Master of Science in Organization Leadership from Pfeiffer University and is currently pursuing a Master of Business Administration with a concentration in Global Management.  Erica is regarded in the industry as a "get it done" person because of her passion for people, process, and service excellence.

Melandee Jones is a Charlotte, North Carolina-based Senior Systems Analyst with Ally Financial.  Melandee has a decade of experience and has held quality positions including Project Test Manager and Quality Assurance Lead, among other technical roles.  She constantly stresses the instrumental role of testing in all of her projects.  Melandee has spoken at several regional events on building reliable, quality software.  She is currently pursuing her MBA and continues to champion the role of the Super Analyst.


Driving to Agile Success through Acceptance Test Driven Development
Declan Whelan, Whelan & Associates

Track 2: 3:00 - 4:30

How do you know that your agile team is building the right thing?  If they aren't, how long does it take to figure that out?  Acceptance testing helps ensure that your team builds the right thing and provides rapid feedback if the system no longer behaves as expected.  Automated acceptance tests clarify agile stories through concrete examples and provide ongoing assurance that the system behaves as expected.  Acceptance tests for agile stories also help teams better estimate and plan their work.  But be warned: acceptance test-driven development is hard work and should only be adopted by 'test-infected' teams that are already doing test-driven development.  In this workshop, see examples of automated acceptance tests, including the key characteristics of good tests.  Practice writing acceptance tests and explore how automated acceptance tests fit your strategy.  Finally, learn about tools to use for ATDD and examine the pros and cons of the technique.

  • Discover the benefits and drawbacks of acceptance test driven development
  • Learn the techniques for using this method successfully
  • Explore toolsets and strategies for implementing ATDD
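
For readers new to the technique, a minimal sketch of what an automated acceptance test for a hypothetical story might look like; the story, the ShoppingCart class, and the use of plain unittest are illustrative assumptions (teams often use dedicated ATDD tools such as FitNesse or Cucumber instead):

    # Illustrative sketch only: an automated acceptance test expressing an agile
    # story as a concrete example. The story and ShoppingCart class are hypothetical.
    import unittest

    class ShoppingCart:                      # stand-in for the real system under test
        def __init__(self):
            self.items = []
        def add(self, price):
            self.items.append(price)
        def total(self):
            subtotal = sum(self.items)
            # Story: "Orders of $100 or more receive a 10% discount."
            return subtotal * 0.9 if subtotal >= 100 else subtotal

    class DiscountAcceptanceTest(unittest.TestCase):
        def test_order_of_100_or_more_gets_ten_percent_discount(self):
            # Given a cart holding $120 of merchandise
            cart = ShoppingCart()
            cart.add(120)
            # When the total is calculated
            # Then a 10% discount is applied
            self.assertEqual(cart.total(), 108)

    if __name__ == "__main__":
        unittest.main()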

About the Workshop Leader...
Declan Whelan is an active software developer and agile coach.  He is a professional engineer with 25 years of experience in the software industry, supporting many types of businesses including financial, medical, educational, manufacturing, and utilities.  He was co-founder and CTO of Innovasys, a start-up company that developed electronic imaging and workflow products for the financial market, and successfully guided the company from a start-up to a profitable venture and eventual sale.  Declan is a certified Scrum Master and a member of the IEEE Computer Society, the Agile Alliance, and the Scrum Alliance.  His focus is on working in the trenches with teams to deliver better value, quality, and time-to-market through agile principles and practices.


The Road to Successful Exploratory Testing
Paul Carvalho, Software Testing and Quality Services (STAQS)

Track 3: 3:00 - 4:30

You heard the hype about Exploratory Testing (ET) and decided to try it, only to discover that it didn't work for you.  It was a waste of time.  Perhaps there was confusion and miscommunication.  Perhaps there was resistance from some of your testers or other project team members to a new testing approach that doesn't have clear test documentation.  Perhaps you just didn't see any noticeable gains from using this approach.  So, what went wrong?  Good ET requires commitment, training, leadership, dedication, and a new way of looking at the testing problem.  It changes how the tester interacts with the system and project team members, and it requires a new way of managing the testing; yes, both tester and manager are affected.  In this workshop you will have an opportunity to share your stories or concerns, compare and contrast ET with other traditional approaches, and discuss the critical success factors required to help your testing effort succeed.

  • Good ET requires commitment, training, leadership, and dedication from testers and managers
  • ET requires changes in how you manage the testing effort
  • Testers must change how they interact with the project team members and how they track testing

About the Workshop Leader...
Paul Carvalho has been in the software industry for over 20 years, with roles in testing, management, software quality assurance, development, training and support.  He has worked on a range of technologies in the financial services, telecommunications, healthcare, logistics, national defense, scientific, engineering and commercial software fields.  Paul is a specialist in systems analysis, test design and quality assurance, and has published articles on test techniques and hiring software testers.  Paul is a practitioner, coach and presenter of methods in exploratory testing.  He has worked exclusively with ET for over 7 years with a focus on black box, internationalization, security and performance testing.





