Breaking Software - StarWest

Transcription

OCTOBER 12–17, 2014
ANAHEIM, CA
DISNEYLAND HOTEL
STARWEST.TECHWELL.COM
REGISTER BY SEPTEMBER 12, 2014 AND SAVE UP TO $200!
Groups of 3+ save even more!
AN INVITATION FROM THE PROGRAM CHAIR
On behalf of TechWell, I’d like to invite you to join us for a knowledge-expanding and career-building
experience in Anaheim at the 23rd annual Software Testing Analysis and Review (STAR) conference.
The tester’s world is changing, and today we are facing new challenges, pressures, and opportunities.
The conference helps you learn both classical testing practices and new methodologies to grow your
skills, supercharge your knowledge, and re-energize how you view your profession.
You’ll have the opportunity to learn from thought leaders in the testing industry and chat with them
in person about your challenges. Plus, Anaheim is a great host city for the conference with all its
entertainment venues. Please join us this October at STARWEST!
Regards,
Lee Copeland
Program Chair, STARWEST
A WIDE VARIETY OF TESTING TOPICS
What's happening now in software testing? STARWEST offers a wide variety of testing topics at the conference: MOBILE TESTING • TEST MANAGEMENT • TEST TECHNIQUES • METRICS • TEST AUTOMATION • CLOUD TESTING • PERSONAL IMPROVEMENT • CONTINUOUS DELIVERY • REQUIREMENTS • AGILE TESTING • SECURITY
CONTENTS
4 Conference Schedule
6 Hotel Spotlight
7 Networking & Special Events
8 Training Classes
10 Tutorials
16 Keynotes
18 Concurrent Sessions
25 Bonus Sessions
26 Testing & Quality Leadership Summit
28 The Expo
29 Exhibitors, Sponsors, & Partners
30 Ways to Save
31 Registration & Pricing Details
WHO’S BEHIND THE CONFERENCE?
Learn.Share.Connect—TechWell.com is brought to you by Software Quality Engineering (SQE). A leader
in the software industry for twenty-eight years, SQE delivers a variety of software training, conferences,
publications, consulting, and website communities. www.TechWell.com
STAY CONNECTED
Stay up-to-date on all of the latest TechWell happenings—including conferences, training, publishing, and
other valuable resources for the software industry. Join our mailing list at: http://vlt.me/connect
Join the social conversation @TechWell or #starwest!
CALL 888.268.8770 OR 904.278.0524 TO REGISTER • STARWEST.TECHWELL.COM
CONFERENCE OVERVIEW
Build your own conference—training classes, tutorials, keynotes, concurrent sessions, the Leadership Summit, and
more—packed with information covering the latest technologies, trends, and practices in software testing.
SUNDAY
Agile Tester Certification—ICAgile (2 days)
Fundamentals of Agile Certification—ICAgile (2 days)
Mobile Application Testing (2 days)
Software Tester Certification—Foundation Level (3 days)
Real-World Software Testing with Microsoft Visual Studio® (3 days)
Mastering HP LoadRunner® for Performance Testing (3 days)
MONDAY–TUESDAY
Multi-Day Training Classes Continue
Requirements-Based Testing (2 days)
34 In-Depth Half- and Full-Day Tutorials
WEDNESDAY–THURSDAY
5 Keynotes
42 Concurrent Sessions
The Expo
Networking and Special Events
Test Lab
…and More!
FRIDAY
Testing & Quality Leadership Summit
Attend the Testing & Quality Leadership Summit Thursday evening and Friday. Join senior leaders from the
industry to gain new perspectives and share ideas on today’s software testing issues. See page 26 for more
information on the Testing & Quality Leadership Summit. (Summit registration required)
Workshop on Regulated Software Testing (WREST)
See page 25 for more information. (Free, but pre-registration required)
CHOOSE THE CONFERENCE PACKAGE THAT WORKS BEST FOR YOUR BUDGET
BIG SAVER—1 Tutorial Day: $945
SMART SAVER—Conference Only: $1,895
SUPER SAVER (MOST POPULAR)—Conference + 2 Days: $2,345
For a complete list of pricing options, go to: http://starwest.techwell.com
Prices valid when you register by September 12th
CONFERENCE SCHEDULE
SUNDAY, OCTOBER 12
8:30
Multi-day training classes begin: Agile Tester Certification—ICAgile • Mobile Application Testing • Software Tester Certification—Foundation
Level • Real-World Software Testing with Microsoft Visual Studio® • Fundamentals of Agile Certification—ICAgile • Mastering HP LoadRunner®
for Performance Testing
MONDAY, OCTOBER 13
8:30
Requirements-Based Testing • Multi-day training classes continue from Sunday (8:30am–5:00pm)
8:30
Tutorials (8:30am–12:00pm)
MONDAY MORNING TUTORIALS
ME What's Your Leadership IQ? NEW—Jennifer Bonine, tap|QA, Inc.
MF Application Performance Testing: A Simplified Universal Approach—Scott Barber, SmartBear
MG Measurement and Metrics for Test Managers—Rick Craig, Software Quality Engineering
MH Take a Test Drive: Acceptance Test-Driven Development NEW—Jared Richardson, Agile Artisans
MI A Dozen Keys to Agile Testing Maturity—Bob Galen, Velocity Partners, and Mary Thorn, ChannelAdvisor
MJ Applying Emotional Intelligence to Testing NEW—Thomas McCoy, Australian Department of Social Services
MONDAY FULL-DAY TUTORIALS
MA The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More—Hans Buwalda, LogiGear
MB A Rapid Introduction to Rapid Software Testing—Michael Bolton, DevelopSense
MC Getting Started with Risk-Based Testing—Dale Perry, Software Quality Engineering
MD Introduction to Selenium and WebDriver NEW—Alan Richardson, Compendium Developments
12:00 Lunch
1:00 Tutorials (1:00pm–4:30pm)
MONDAY AFTERNOON TUTORIALS
MK Being Creative: A Visual Testing Workshop NEW—Andy Glover, Exco InTouch
ML Innovation Thinking: Evolve and Expand Your Capabilities—Jennifer Bonine, tap|QA, Inc.
MM Satisfying Auditors: Plans and Evidence in a Regulated Environment NEW—James Christie, Claro Testing
MN Test Automation Patterns: Issues and Solutions—Seretta Gamba, Steria Mummert ISS GmbH, and Mark Fewster, Grove Consultants
MO Exploratory Testing Explained—Paul Holland, Doran Jones, Inc.
MP Jon Bach: On Testing NEW—Jon Bach, eBay, Inc.
MONDAY FULL-DAY TUTORIALS (CONTINUED)
MA The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More—Hans Buwalda, LogiGear
MB A Rapid Introduction to Rapid Software Testing—Michael Bolton, DevelopSense
MC Getting Started with Risk-Based Testing—Dale Perry, Software Quality Engineering
MD Introduction to Selenium and WebDriver NEW—Alan Richardson, Compendium Developments
TUESDAY, OCTOBER 14
8:30
Multi-day training classes continue (8:30am–5:00pm)
8:30
Tutorials (8:30am–12:00pm)
TUESDAY MORNING TUTORIALS
TC Fundamental Test Design Techniques NEW—Lee Copeland, Software Quality Engineering
TD Exploratory Testing Is Now in Session—Jon Bach, eBay, Inc.
TE Integrating Automated Testing into DevOps NEW—Jeff Payne, Coveros
TF Essential Test Management and Planning—Rick Craig, Software Quality Engineering
TG Testing Cloud Services NEW—Martin Pol and Jeroen Mengerink, Polteq Test Services B.V.
TH Planning, Architecting, Implementing, and Measuring Automation NEW—Mike Sowers, Software Quality Engineering
TI Test Management for Large, Multi-Project Programs NEW—Geoff Horne, NZTester magazine
TJ Exploring Usability Testing for Mobile and Web Technologies—Rob Sabourin, AmiBug.com, Inc.
TUESDAY FULL-DAY TUTORIALS
TA Critical Thinking for Software Testers—Michael Bolton, DevelopSense
TB Successful Test Automation: A Manager's View—Mark Fewster, Grove Consultants
12:00 Lunch
1:00 Tutorials (1:00pm–4:30pm)
TUESDAY AFTERNOON TUTORIALS
TK Pairwise Testing Explained NEW—Lee Copeland, Software Quality Engineering
TL Security Testing for Test Professionals—Jeff Payne, Coveros, Inc.
TM End-to-End Testing with the Heuristic Software Test Model NEW—Paul Holland, Doran Jones, Inc.
TN Testing the Data Warehouse—Big Data, Big Problems—Geoff Horne, NZTester magazine
TO Getting Your Message Across: Communication Skills for Testers NEW—Thomas McCoy, Australian Department of Social Services
TP Introducing Keyword-Driven Test Automation—Hans Buwalda, LogiGear
TQ Test Estimation in Practice—Rob Sabourin, AmiBug.com
TR Test Automation Strategies for the Agile World NEW—Bob Galen, Velocity Partners
TUESDAY FULL-DAY TUTORIALS (CONTINUED)
TA Critical Thinking for Software Testers—Michael Bolton, DevelopSense
TB Successful Test Automation: A Manager's View—Mark Fewster, Grove Consultants
4:30
Welcome Reception (4:30pm-6:30pm)
6:30
Bonus Session—Speaking 101: Tips and Tricks (6:30pm–7:30pm)
WEDNESDAY, OCTOBER 15
7:15
BREAKFAST BONUS SESSION: Service Virtualization and Continuous Integration—Sponsored by Cognizant
8:45
KEYNOTE: Quality Principles for Today’s “Glueware”—Testing Web Services, Libraries, and Frameworks—Julie Gardiner, Redmind
10:00
KEYNOTE: Balancing the Crusty and Old with the Shiny and New—Bob Galen, Velocity Partners
11:00
Networking Break • Visit the Expo, 10:30am–2:00pm
11:30 Concurrent Sessions
W1 Building Quality In: Adopting the Tester's Mindset—Stephen Vance, Stellar Advances (Test Management)
W2 Testing Lessons Learned from Sesame Street—Rob Sabourin, AmiBug.com (Test Techniques)
W3 Why Automation Fails—in Theory and in Practice (Test Automation)
W4 A Tester's Guide to Collaborating with Product Owners—Bob Galen, Velocity Partners (Agile Testing)
W5 Growing into Leadership—Pete Walen (Personal Excellence)
W6 Testing Compliance with Accessibility Guidelines—Jim Trentadue, Ranorex (Special Topics)
Software Testing & Anthropology—Anish Krishnan, Hexaware Technologies, Ltd.
12:30 Lunch • Visit the Expo • Meet the Speakers
1:45 Concurrent Sessions
W7 The Role of Testing: Quality Police or Quality Communicator?—Mike Duskis, 10-4 Systems
W8 Virtualization: Improve Speed and Increase Quality—Clint Sprauve, HP
W9 Functional Testing with Domain-Specific Languages—Tariq King, Ultimate Software
W10 Agile Development and Testing in a Regulated Environment—John Pasko, Karl Storz Imaging
W11 Adventures of a Social Tester—Martin Nilsson, House of Test
W12 Test Improvement: In Our Rapidly Changing World—Martin Pol, Polteq Testing Services BV
3:00 Concurrent Sessions
W13 The Test Manager's Role in Agile: Balancing the Old and the New—Mary Thorn, ChannelAdvisor
W14 Testing the New Disney World Website—Les Honniball, Walt Disney Parks and Resorts Technology
W15 End-to-End Test Automation with Open Source Technologies—Ramandeep Singh, QA InfoTech
W16 Your Team's Not Agile If You're Not Doing Agile Testing—Jeanne Schmidt, Rural Sourcing, Inc.
W17 Speak Like a Test Manager—Michael Sowers, Software Quality Engineering
W18 Implementing Outsourced Testing Services with a Third Party—Shelley Reuger, Moxie Software
4:00 Networking Break • Visit the Expo, 3:30pm–6:30pm
4:30 KEYNOTE: Lightning Strikes the Keynotes—facilitated by Lee Copeland, Software Quality Engineering
5:30 Reception in the Expo Hall, 5:30pm–6:30pm
THURSDAY, OCTOBER 16
8:30
KEYNOTE: Softwarts: Security Testing for Muggles—Paco Hope, Cigital
9:45 Concurrent Sessions
T1 "Rainmaking" for Test Managers—Julie Gardiner, Redmind (Test Management)
T2 Release the Monkeys: Testing Using the Netflix Simian Army—Gareth Bowles, Netflix (Test Techniques)
T3 A Path through the Jungle: Validating a Test Automation System for the FDA—Chris Crapo and David Nelson, Boston Scientific Neuromodulation (Test Automation)
T4 Top Ten Attacks to Break Mobile Apps—Jon Hagar, Grand Software Testing (Mobile Testing)
T5 Using DevOps to Improve Software Quality in the Cloud—Jeff Payne, Coveros (Continuous Delivery)
T6 Testers, Use Metrics Wisely or Don't Use Them at All—Deborah Kennedy, Aditi Technologies (Metrics)
10:45 Networking Break • Visit the Expo, 10:30am–3:00pm
11:15 Concurrent Sessions
T7 Leading Internationally Distributed Test Teams—Dennis Pikora, Symantec (Test Management)
T8 A Feedback-Driven Framework for Testing—Erik Petersen, emprove (Test Techniques)
T9 Automation Abstractions: Page Objects and Beyond—Alan Richardson, Compendium Developments (Test Automation)
T10 Bridging the Gap in Mobile App Quality—Costa Avradopoulos, Capgemini Consulting (Mobile Testing)
T11 Checking Performance along Your Build Pipeline—Andreas Grabner, Compuware (Continuous Delivery)
T12 Metrics That Matter—Pablo Garcia, Redmind (Metrics)
12:15 Lunch • Visit the Expo • Meet the Speakers
1:30 Concurrent Sessions
T13 The Unfortunate Triumph of Process over Purpose—James Christie, Claro Testing (Test Management)
T14 Speed Up Testing Using Monitoring Tools—Jim Hirschauer, AppDynamics (Test Techniques)
T15 Making Your Test Automation Transparent—Subodh Parulekar, AFour Technologies, Inc. (Test Automation)
T16 Ensuring the Performance of Mobile Apps—on Every Device and Network—Steve Weisfeldt, Neotys (Mobile Testing)
T17 Build Your Custom Performance Testing Framework—Prashant Suri, Rackspace (Performance Testing)
T18 Testing Application Security: The Hacker Psyche Exposed—Mike Benkovich, Imagine Technologies, Inc. (Security)
2:30 Networking Break • Visit the Expo (closes at 3:00pm)
3:00 Concurrent Sessions
T19 Before You Test Your System, Test Your Assumptions—Aaron Sanders, Agile Coach (Test Management)
T20 User Acceptance Testing in the Testing Center of Excellence—Deepika Mamnani, Hexaware Technologies (Test Techniques)
T21 The Doctor Is In: Diagnosing Test Automation Diseases—Seretta Gamba, Steria Mummert ISS GmbH (Test Automation)
T22 Five Ways to Improve Your Mobile Testing—Dennis Schultz, IBM (Mobile Testing)
T23 Modeling System Performance with Production Data—William Hurley, Astadia (Performance Testing)
T24 Testing API Security: A Wizard's Guide—Ole Lensmar, SmartBear Software (Security)
4:15 KEYNOTE: Why Testers Need to Code: Facebook's No Testing Department Approach—Simon Stewart, Facebook
5:30 Testing & Quality Leadership Summit Reception, 5:30pm–6:30pm (Summit registration required)
FRIDAY, OCTOBER 17
Testing & Quality Leadership Summit
Attend the Testing & Quality Leadership Summit Thursday (5:30pm) and Friday (all day). Join senior leaders
from the industry to gain new perspectives and share ideas on today’s software testing issues. See page 26 for
more information on the Testing & Quality Leadership Summit. (Summit registration required)
Workshop on Regulated Software Testing (WREST) See page 25 for more information.
(Free, but pre-registration required)
HOTEL SPOTLIGHT
EXCLUSIVE RATES at the Disneyland Hotel
STARWEST 2014 will be held at the
recently renovated Disneyland Hotel,
featuring all new luxurious guest
rooms with the comfort and amenities
that the business traveler has come
to expect. Soak up the California
sun after your meetings in one of
the three renovated pools or relax at
either of the outdoor hot tubs. Look
forward to experiencing legendary
quality and outstanding Cast Member
service at the Disneyland Hotel.
Special Hotel Rates for STARWEST Attendees
Book your room reservation at the Disneyland Hotel at
the exclusive conference rate by September 28, 2014.
Space is limited, so please reserve your room early! Use
one of these options to make a reservation:
• CALL DISNEY®!—Call Disneyland Hotel reservations
at 714.520.5005, available Mon–Fri from 8am–5pm
PST. When calling, be sure to mention the
STARWEST conference to get the special conference
rate. If you need special facilities or services, please
notify the agent at the time of reservation.
• BOOK ONLINE—To book your hotel online or view the special conference room rates, go to http://vlt.me/sw14hotel.
• CALL US!—Call our Client Support Group at 888.268.8770.
Disneyland Hotel is located at:
1150 West Magic Way
Anaheim, CA 92802
714.520.5005
* Cancellations on a guaranteed reservation must occur more than 5 days prior to the specified arrival time to ensure a refund.
Stay at the Center of the Action
Networking opportunities will be around every corner and inside every elevator at the Disneyland Hotel. Save time getting to and from the sessions and exhibits—while enjoying the convenience of going back to your room between events to make phone calls and check emails. Plus, you're just footsteps away from additional dining and entertainment at Downtown Disney® and the two Disney® theme parks!
NETWORKING EVENTS
Welcome Reception
Tuesday, October 14 • 4:30–6:30pm
Kick off the conference with a welcome reception! Mingle with experts and colleagues while enjoying complimentary food and beverages.
STARWEST Test Lab
Wednesday, October 15–Thursday, October 16
Visit the interactive STARWEST Test Lab to practice the skills and techniques you're learning at the conference. Compete with your fellow testers to find bugs, join speakers to practice skills and techniques presented in class, participate in discussion groups, and more!
Expo Reception
Wednesday, October 15 • 5:30–6:30pm
Network with peers at the Expo reception and enjoy
complimentary food and beverages. Be sure to play
the Passport game for your chance to win great
prizes!
Meet the Speakers at Lunch
Wednesday, October 15–Thursday, October 16
During Lunch
Meet with industry experts for open discussions in key
areas of software testing. On both days, there will be lunch
tables designated by topic of interest. Come pose your
toughest questions!
Presenter One-on-One
Wednesday, October 15–Thursday, October 16
STARWEST offers the unique opportunity to schedule
a 15-minute, one-on-one session with a STARWEST
presenter. Our speakers have years of industry
experience and are ready to share their insight with
you. Bring your biggest issue, your testing plans, or
whatever’s on your mind. Leave with fresh ideas on
how to approach your testing challenges. You’ll have
the chance to sign up during the conference and get
some free consulting!
Bookstore and Speaker Book Signings
Tuesday, October 14–Thursday, October 16
Purchase popular industry books—many authored by STARWEST speakers—from BreakPoint Books. Select authors are available for questions and book signings during session breaks and Expo hours.
Testimonials
"Overall, one fantastic experience that inspires improvement and learning! I would recommend to others."
—Kathleen Greenburg, FirstBank Data Corp.
"Wonderful senior testers took me under their wings and encouraged me to fly. I feel so optimistic now! This was a transformative experience."
—Morten Max Andersen, NNIT
"Good chance to network and also very open approach by the speakers being available and greeting throughout the entire conference."
—Claire Moss, Daxko
COMBINE IN-DEPTH TRAINING WITH YOUR CONFERENCE AND SAVE $300
Combine your conference with in-depth training to enhance your learning
experience. Take advantage of networking, benefit from access to top industry
experts, and mingle with colleagues while you improve your skill set. View full
course descriptions at http://vlt.me/swtraining.
Members of the PMI are
eligible to earn up to 22.5
PDUs for select courses.
Mobile Application Testing
Sunday, October 12–Monday, October 13 • 8:30am–5:00pm
The Mobile Application Testing course will cover usability across multiple platforms and resolutions, network and
security testing, creating application unit tests, mobile UI automation, and performance testing for various devices
over various networks and carriers. A mobile device such as a smartphone or tablet is required.
• Understand what makes mobile application testing different from standard software testing
• Learn some of the underlying technologies behind mobile devices and how those technologies affect testing
• Discover how mobile applications work and different techniques for testing them
• Explore the different types of mobile applications and how to test for each
Jeff Pierce
Mastering HP LoadRunner® for Performance Testing
Sunday, October 12–Tuesday, October 14 • 8:30am–5:00pm
Mastering HP LoadRunner® for Performance Testing provides students with the knowledge and skills to use the
latest testing tools provided by HP to validate decisions and improve software performance. By the end of the
course, students are equipped to begin planning the implementation of LoadRunner® and Performance Center for
improving testing practices within their organizations.
• Understand performance implications of technologies and protocols in modern data centers
• Select scenarios to measure performance and capacity risks organizations face today
• Design emulation scripts, scenarios, and reports to expose various risks
• Set up controllers, load generators, monitoring, and virtual table servers
• Generate and edit TruClient and VuGen scripts to emulate internet browsers and use test data
Wilson Mar
Agile Tester Certification—ICAgile
Sunday, October 12–Monday, October 13 • 8:30am–5:00pm
In Agile Tester Certification—ICAgile, you will learn the fundamentals of agile development, the role of the
tester in the agile team, and the agile testing processes. From user story elicitation and grooming through
development and testing, this course prepares you to be a valuable member of an agile development team.
• Discover how testing is implemented in different agile environments
• Learn about user stories and how to test them
• Explore key agile testing practices—ATDD, BDD, TDD, and ET
• Recognize the main agile testing challenges and how to address them
Robert Sabourin
Requirements-Based Testing
Monday, October 13–Tuesday, October 14 • 8:30am–5:00pm
Requirements-Based Testing (RBT) delivers a proven, rigorous approach for designing a consistent and repeatable set
of highly optimized test cases. Companies employing RBT practices have achieved twice the requirements coverage
with only half the tests they previously maintained. Explore alternative test design techniques and the advantages and
disadvantages of each. Learn how to complement functional, black-box testing with code-based, white-box testing to
further ensure complete coverage and higher quality. Classroom exercises are employed throughout the workshop to
reinforce your learning. Bring samples from your own projects to work on and evaluate during class.
• Develop and maintain efficient tests that cover all functional requirements
• Design test cases that force defects to appear early in testing
• Learn and practice cause-effect graphing to design more robust tests
• Optimize and reduce the size of your test suite
Richard Bender
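The coverage-versus-suite-size claim above can be made concrete with a combinatorial technique. RBT itself relies on cause-effect graphing, but as a loosely related illustration of how a test suite shrinks while still covering every interaction of two parameters, here is a minimal all-pairs (pairwise) sketch; the greedy strategy and the parameter values are invented for this example, not part of the course.

```python
from itertools import combinations, product

def all_pairs(params):
    """Greedy all-pairs reduction: keep picking the candidate test case
    that covers the most still-uncovered value pairs until every pair of
    parameter values appears in at least one chosen case."""
    names = list(params)
    uncovered = set()
    for (i, a), (j, b) in combinations(list(enumerate(names)), 2):
        for va, vb in product(params[a], params[b]):
            uncovered.add((i, va, j, vb))
    candidates = list(product(*params.values()))
    tests = []
    while uncovered:
        def gain(case):
            return sum((i, case[i], j, case[j]) in uncovered
                       for i, j in combinations(range(len(case)), 2))
        best = max(candidates, key=gain)
        if gain(best) == 0:
            break
        tests.append(best)
        for i, j in combinations(range(len(best)), 2):
            uncovered.discard((i, best[i], j, best[j]))
    return tests

# Illustrative configuration space: 3 * 2 * 3 = 18 exhaustive cases.
browsers = {"browser": ["Chrome", "Firefox", "IE"],
            "os": ["Windows", "OS X"],
            "locale": ["en", "de", "fr"]}
suite = all_pairs(browsers)
```

The reduced suite still exercises every browser/os, browser/locale, and os/locale pairing, which is the sense in which fewer tests can keep (or raise) coverage.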
Fundamentals of Agile Certification—ICAgile
Sunday, October 12–Monday, October 13 • 8:30am–5:00pm
Fundamentals of Agile Certification—ICAgile will present a roadmap for how to get started with agile along with
practical advice. It will introduce you to agile software development concepts and teach you how to make them
work. You will learn what agile is all about, why agile works, and how to effectively plan and develop software
using agile principles. A running case study allows you to apply the techniques you are learning as you go
through the course.
• Explore agile software development methodologies and approaches
• Understand differences between agile and traditional methodologies
• Learn how agile practices and principles improve the software development process
• Discover the major steps required to successfully plan and execute an agile software project
• Explore the leading agile development best practices
Jeff Payne
Software Tester Certification—Foundation Level
Sunday, October 12–Tuesday, October 14 • 8:30am–5:00pm
Delivered by top experts in the testing industry, Software Tester Certification—Foundation Level is an accredited training course,
designed to help prepare you for the ISTQB® Certified Tester—Foundation Level exam. This certification program, accredited by the
ISTQB® through its network of National Boards, is the only internationally accepted certification for software testing. The ISTQB®,
a non-proprietary and nonprofit organization, has granted more than 320,000 certifications in over 100 countries around the world.
This course is most appropriate for individuals who recently entered the testing field and those currently seeking ISTQB® certification
in software testing.
• Fundamentals of software testing—key concepts, context, risk, goals, process, and people issues
• Lifecycle testing—relationship of testing to development, including different models, verification and
validation, and types of testing
• Test levels—system, acceptance, unit, and integration testing
• Test design techniques—black-box test methods, white-box testing, and exploratory testing
• Static testing—reviews, inspections, and static analysis tools
• Test management—team organization, key roles and responsibilities, test approach and planning,
configuration management, defect classification and tracking, test reporting
• Testing tools—selection, benefits, risks, and classifications
Claire Lohr
Real-World Software Testing with Microsoft Visual Studio®
Sunday, October 12–Tuesday, October 14 • 8:30am–5:00pm
This course provides students with real world software testing techniques and technical skills using the latest Microsoft Test Manager 2013®,
Visual Studio 2013®, and Team Foundation Server 2013® tools. We will cover manual testing features such as test case management,
execution and reporting, and how Visual Studio® makes these processes powerful and organized. You will learn about the newly released
Visual Studio® Web Test Manager and be introduced to automated testing with Visual Studio®. Discover how to effectively integrate QA with
Team Foundation Server’s requirements, bug tracking, and work and build management capabilities. Increase automation
effectiveness using virtual lab environments.
• Increase productivity by planning, executing, and tracking tests using Microsoft Test Manager 2013®
• Learn how rich data collectors enhance bug reproducibility
• Support agile testing practices with features such as exploratory testing
• Increase test coverage with automated testing using Microsoft's Visual Studio® Coded UI
• Collaborate seamlessly with other team members using Team Foundation Server 2013®
• Take advantage of the latest Visual Studio 2013® virtualization integration
Anna Russo
For more details on combining training with your conference, contact the Client Support Group at
[email protected] or call 888.268.8770 or 904.278.0524.
TUTORIALS
MONDAY, OCTOBER 13, 8:30–4:30 (FULL-DAY)
MA The Challenges of BIG Testing: Automation, Virtualization, Outsourcing, and More
Hans Buwalda, LogiGear
Large-scale and complex testing projects can stress the testing and automation practices we have learned through the years, resulting in
less than optimal outcomes. However, a number of innovative ideas and concepts are emerging to better support industrial-strength testing
for big projects. Hans Buwalda shares his experiences and presents strategies for organizing and managing testing on large projects. Learn
how to design tests specifically for automation, including how to incorporate keyword testing and other techniques. Learn what roles virtualization and
the cloud can play—and the potential pitfalls of such options. Take away tips and tricks to make automation more stable, and to deal with the numerous
versions and configurations common in large projects. Hans also describes the main challenges with global teams including time zones and cultural
differences, and offers seven common problem “patterns” in globalization and what you can do to address them.
MB A Rapid Introduction to Rapid Software Testing
Michael Bolton, DevelopSense
You’re under tight time pressure and have barely enough information to proceed with testing. How do you test quickly and inexpensively,
yet still produce informative, credible, and accountable results? Rapid Software Testing, adopted by context-driven testers worldwide, offers
a field-proven answer to this all-too-common dilemma. In this one-day sampler of the approach, Michael Bolton introduces you to the skills
and practice of Rapid Software Testing through stories, discussions, and “minds-on” exercises that simulate important aspects of real testing problems.
The rapid approach isn’t just testing with speed or a sense of urgency; it’s mission-focused testing that eliminates unnecessary work, assures that the most
important things get done, and constantly asks how testers can help speed up the successful completion of the project. Join Michael to learn how Rapid
Testing focuses on both the mind set and skill set of the individual tester, using tight loops of exploration and critical thinking skills to help continuously
re-optimize testing to match clients’ needs and expectations. Participants are strongly encouraged to bring a Windows-compatible computer to the class.
MC Getting Started with Risk-Based Testing
Dale Perry, Software Quality Engineering
Whether you are new to testing or looking for a better way to organize your test practices, understanding risk is essential to successful
testing. Dale Perry describes a general risk-based framework—applicable to any development lifecycle model—to help you make critical
testing decisions earlier and with more confidence. Learn how to focus your testing effort, what elements to test, and how to organize test
designs and documentation. Review the fundamentals of risk identification, analysis, and the role testing plays in risk mitigation. Develop an inventory of
test objectives to help prioritize your testing and translate them into a concrete strategy for creating tests. Focus your tests on the areas essential to your
stakeholders. Execution and assessing test results provide a better understanding of both the effectiveness of your testing and the potential for failure in
your software. Take back a proven approach to organize your testing efforts and new ways to add more value to your project and organization.
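As a sketch of the kind of prioritization such an inventory of test objectives enables (the objectives and 1–5 ratings below are invented for illustration; Dale Perry's actual framework is richer than this), assuming a simple likelihood × impact risk score:

```python
# Hypothetical inventory of test objectives; "likelihood" (of failure)
# and "impact" (on stakeholders) are 1-5 ratings agreed with the team.
inventory = [
    {"objective": "checkout total is correct", "likelihood": 4, "impact": 5},
    {"objective": "password reset email sent", "likelihood": 2, "impact": 4},
    {"objective": "help page renders",         "likelihood": 2, "impact": 1},
    {"objective": "payment gateway timeout",   "likelihood": 3, "impact": 5},
]

def prioritize(items):
    """Rank test objectives by risk score = likelihood * impact,
    highest-risk first, so testing effort goes there first."""
    return sorted(items, key=lambda it: it["likelihood"] * it["impact"],
                  reverse=True)

for it in prioritize(inventory):
    print(it["likelihood"] * it["impact"], it["objective"])
```

The ranking, not the arithmetic, is the point: it gives a defensible order in which to spend limited testing time.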
MD Introduction to Selenium and WebDriver NEW
Alan Richardson, Compendium Developments
Selenium is an open source automation tool for test driving browser-based applications. WebDriver, the newly-introduced API for Selenium
against which tests are written in Java, contains classes including ChromeDriver, AndroidDriver, and iPhoneDriver. Sometimes test authors
find the API daunting and their initial automation code brittle and poorly structured. In this introduction, Alan Richardson provides hints and
tips gained from his years of experience both using WebDriver and helping others improve their use of the tool. Alan starts at the beginning, explaining
the basic WebDriver API capabilities—simple interrogation and navigation—and then moves on to synchronization strategies and working with AJAX
applications. He covers tools and location strategies to find elements on web pages using CSS and XPath. Alan provides an introduction to abstraction
approaches which help you build robust, reliable, and maintainable automation suites.
LAPTOP REQUIRED
Hands-on exercises require a laptop computer with Firefox, Firebug, and Firepath installed. You will write code! Coding exercises require an IDE (IntelliJ),
Java SDK, and Maven. Prior to the session, follow the Getting Started Guide at http://seleniumsimplified.com/get-started. Come ready to learn.
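The abstraction approaches the tutorial description mentions often take the form of page objects. The sketch below shows the idea in plain Python with a stubbed driver so it runs without a browser or the Selenium package; note that real WebDriver bindings locate elements via By strategies (CSS, XPath) rather than this simplified `find_element(css)` stub, and every class and selector name here is invented for illustration.

```python
class FakeElement:
    """Stand-in for a web element; real WebDriver elements also
    expose send_keys()."""
    def __init__(self):
        self.text = ""
    def send_keys(self, value):
        self.text += value

class StubDriver:
    """Stand-in for a WebDriver instance: maps CSS selectors to
    fake elements so the pattern can be shown without a browser."""
    def __init__(self, fields):
        self.fields = fields
    def find_element(self, css_selector):
        return self.fields[css_selector]

class LoginPage:
    """Page object: one class owns the page's locators, so tests call
    login() instead of repeating brittle CSS/XPath lookups everywhere."""
    USER_CSS, PASS_CSS = "#user", "#pass"

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.find_element(self.USER_CSS).send_keys(user)
        self.driver.find_element(self.PASS_CSS).send_keys(password)

driver = StubDriver({"#user": FakeElement(), "#pass": FakeElement()})
LoginPage(driver).login("alice", "secret")
```

If a selector changes, only `LoginPage` is edited; the tests that call `login()` stay untouched, which is what makes suites built this way maintainable.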
COMBINE AND SAVE! Save an additional $300 when you attend any of the multi-day training classes and the conference. (Discount already reflected in the conference pricing.) See page 8 for more details.
TUTORIALS
MONDAY, OCTOBER 13, 8:30–12:00 (HALF-DAY – MORNING)
ME What's Your Leadership IQ? NEW
Jennifer Bonine, tap|QA, Inc.
Have you ever needed a way to measure your leadership IQ? Or been in a performance review where the majority of time was spent
discussing your need to improve as a leader? If you have ever wondered what your core leadership competencies are and how to build on
and improve them, Jennifer Bonine shares a toolkit to help you do just that. This toolkit includes a personal assessment of your leadership
competencies, explores a set of eight dimensions of successful leaders, provides suggestions on how you can improve competencies that are not in your
core set of strengths, and describes techniques for leveraging and building on your strengths. These tools can help you become a more effective and
valued leader in your organization. Exercises help you gain an understanding of yourself and strive for balanced leadership through recognition of both
your strengths and your “development opportunities.”
MF Application Performance Testing: A Simplified Universal Approach
Scott Barber, SmartBear
In response to increasing market demand for high performance applications, many organizations implement performance testing projects,
often at great expense. Sadly, these solutions alone are often insufficient to keep pace with emerging expectations and competitive
pressures. With specific examples from recent client implementations, Scott Barber shares the fundamentals of implementing T4APM™,
a simple and universal approach that is valuable independently or as an extension of existing performance testing programs. The T4APM™ approach
hinges on applying a simple and unobtrusive Target, Test, Trend, Tune cycle to tasks in your application lifecycle—from a single unit test through entire
system production monitoring. Leveraging T4APM™ on a particular task may require knowledge specific to the task, but learning how to leverage the
approach does not. Scott provides everything you need to become the T4APM™ coach and champion, and to help your team keep up with increasing
demand for better performance, regardless of your current title or role.
MG Measurement and Metrics for Test Managers
Rick Craig, Software Quality Engineering
To be most effective, test managers must develop and use metrics to help direct the testing effort and make informed recommendations
about the software’s release readiness and associated risks. Because one important testing activity is to “measure” the quality of the
software, test managers must measure the results of both the development and testing processes. Collecting, analyzing, and using metrics
are complicated because many developers and testers are concerned that the metrics will be used against them. Join Rick Craig as he addresses common
metrics—measures of product quality, defect removal efficiency, defect density, defect arrival rate, and testing status. Learn the guidelines for developing
a test measurement program, rules of thumb for collecting data, and ways to avoid “metrics dysfunction.” Rick identifies several metrics paradigms and
discusses the pros and cons of each. Delegates are urged to bring their metrics problems and issues for use as discussion points.
MH Take a Test Drive: Acceptance Test-Driven Development
NEW
Jared Richardson, Agile Artisans
The practice of agile software development requires a clear understanding of business needs. Misunderstanding requirements causes
waste, slipped schedules, and mistrust within the organization. Jared Richardson shows how good acceptance tests can reduce
misunderstanding of requirements. A testable requirement provides a single source that serves as the analysis document, acceptance
criteria, regression test suite, and progress-tracker for any given feature. Jared explores the creation, evaluation, and use of testable requirements by the
business and developers. Learn how to transform requirements into stories—small units of work—that have business value, small implementation effort,
and easy to understand acceptance tests. This tutorial features an interactive exercise that starts with a high level feature, decomposes it into stories,
applies acceptance tests to those stories, and estimates the stories for business value and implementation effort. The exercise demonstrates how big
requirement stories can be decomposed into business-facing stories, rather than into technical tasks that the business does not understand.
MI A Dozen Keys to Agile Testing Maturity
Bob Galen, Velocity Partners, and Mary Thorn, ChannelAdvisor
You’ve “gone agile” and have been relatively successful. So, how do you know how well your team is really doing? And how
do you continuously improve your practices? When things get rocky, how do you handle the challenges without reverting
to old habits? You realize that the path to high-performance agile testing isn’t easy or quick. It also helps to have a guide.
So consider this workshop your guide to ongoing, improved, and sustained high-performance. Join Bob Galen and Mary Thorn as they share lessons
from their most successful agile testing transitions. Explore actual team case studies for building team skills, embracing agile requirements, fostering
customer interaction, building agile automation, driving business value, and testing at-scale—all building agile testing excellence. Examine the mistakes,
adjustments, and the successes, and learn how to react to real-world contexts. Leave with a better view of your team’s strengths, weaknesses, and where
you need to focus to improve.
MJ Applying Emotional Intelligence to Testing
NEW
Thomas McCoy, Australian Department of Social Services
As test managers and test professionals we can have an enormous emotional impact on others. We’re constantly dealing with fragile egos,
highly charged situations, and pressured people playing a high-stakes game under conditions of massive uncertainty. We’re often the
bearers of bad news and are sometimes perceived as critics, activating people’s primal fear of being judged. Emotional intelligence (EI),
the concept popularized by Harvard psychologist and science writer Daniel Goleman, has much to offer test managers and testers. Key EI skills include
self-awareness, self-management, social awareness, and relationship management. Explore the concept of EI, assess your own levels of EI, and look at
ways in which EI can help. Thomas McCoy discusses how EI can be useful in dealing with anger management, controlling negative thoughts, processing
constructive criticism, and dealing with conflict—all within the context of the testing profession. This lively session is grounded in real-life examples, giving
you concrete ideas to take back to work.
TUTORIALS
MONDAY, OCTOBER 13, 1:00–4:30 (HALF-DAY – AFTERNOON)
MK Being Creative: A Visual Testing Workshop
NEW
Andy Glover, Exco InTouch
The reality is that technology is complicated. As testers, we are challenged with complicated problems that need solving. Andy Glover
presents a hands-on workshop that describes a new way of looking at testing problems and ideas. Andy demonstrates how thinking with
pictures can help testers discover and develop new ideas, solve problems in unexpected ways, and dramatically improve their ability
to share their insights with others. Andy shows how to clarify a problem or sell an idea by visually breaking it down using a set of visualization tools
including mind maps, work flows, and powerful but simple visuals to communicate complex messages. You don’t need to know how to draw to attend
this workshop. Although we are naturally creative, sometimes it takes effort to develop those skills. Join Andy to learn how to apply visual solutions to our
everyday software testing challenges.
ML Innovation Thinking: Evolve and Expand Your Capabilities
Jennifer Bonine, tap|QA, Inc.
Innovation is a word frequently tossed around in organizations today. The standard clichés are do more with less and be creative.
Companies want to be innovative but often struggle with how to define, implement, prioritize, and track their innovation efforts. Using the
Innovation to Types model, Jennifer Bonine will help you transform your thinking regarding innovation and understand if your team and
company goals match their innovation efforts. Learn how to classify your activities as “core” (to the business) or “context” (essential, but non-revenue
generating). Once you understand how your innovation activities are related to revenue generating activities, you can better decide how much of your
effort should be spent on core or context activities. Take away tools including an Innovation to Types model for classifying innovation, a Core and Context
model to classify your activities, and a way to map your innovation initiatives to different contexts.
MM Satisfying Auditors: Plans and Evidence in a Regulated Environment
NEW
James Christie, Claro Testing
Testers want to be responsible and professional. However, they often come under pressure to comply with rules, standards, and processes
that aren’t always helpful. It’s the price of keeping your auditors happy. But do you really know what auditors want? Are they all simply
rule-obsessed, pedantic “little dictators”? James Christie shows why good auditors worry about risk—not rules. They want to explain the
important risks to the people who lose sleep over them. James explains auditors’ and regulators’ attitudes toward risk and evidence. He shows that
auditors’ standards and governance models do have useful advice—knowledge that can help you choose the right testing approach for your project.
James shows how to enlist smart auditors as valuable allies—and how to challenge the poor ones. Understanding auditors’ needs will help you do better
testing, at less cost. Wouldn't senior management and your stakeholders be interested in that?
MN Test Automation Patterns: Issues and Solutions
Seretta Gamba, Steria Mummert ISS GmbH, and Mark Fewster, Grove Consultants
Automating system level test execution can result in many problems. It is surprising to find that many people encounter
the same problems yet are unaware of common solutions that worked well for others. These problem/solution pairs are
called “patterns.” Seretta Gamba recognized the commonality of these test automation issues and their solutions and,
together with Mark Fewster, has organized them into Test Automation Patterns. Although unit test patterns are well known, Seretta’s and Mark’s patterns
address more general issues. They cover management, process, design, and execution patterns to help you recognize common test automation issues
and show you how to identify appropriate patterns to solve the problems. Issues such as No Previous Automation, High ROI Expectations, and High Test
Maintenance Cost are addressed by patterns such as Maintainable Testware, Tool Independence, and Management Support.
LAPTOP
REQUIRED
Bring your laptop to gain access to an offline version of the wiki during the tutorial.
MO Exploratory Testing Explained
Paul Holland, Doran Jones, Inc.
Exploratory testing is an approach to testing that emphasizes the freedom and responsibility of testers to continually optimize the value
of their work. Exploratory testing is the process of three mutually supportive activities—learning, test design, and test execution—done in
parallel. With skill and practice, exploratory testers typically uncover an order of magnitude more problems than when the same amount of
effort is spent on procedurally-scripted testing. All testers conduct exploratory testing in one way or another, but few know how to do it systematically to
obtain the greatest benefits. Even fewer can articulate the process. Paul Holland shares specific heuristics and techniques of exploratory testing that will
help you get the most from this highly productive approach. Paul focuses on the skills and dynamics of exploratory testing, and how it can be combined
with scripted approaches.
MP Jon Bach: On Testing
NEW
Jon Bach, eBay, Inc.
Jon Bach has been in testing for twenty years. Is testing in 2014 different from testing in 1994? If you’ve been in the business that long,
maybe you’ve seen it move from a little bit of automation and tooling to almost total automation and tooling. Maybe you’ve seen lab setups
go from hours of loading OS images on “boat anchor” boxes to virtual, on-demand, scalable cloud provisioning in seconds. Maybe you
think testing is dead because we live in a DevOps world where it’s good enough to run a happy path checklist. Maybe you think testing isn’t dead because
you’ve seen recent computer science graduates who know dangerously little about the craft of sapient testing. Jon wants to know what you’ve seen in your
careers. In exchange for his thoughts, he wants to hear yours: How do you define testing? What are your best ideas? What works for you? Under what
contexts does testing not matter at all? Whether you’re stuck or confused, inspired or hopeful, come, listen, and contribute your experiences…On Testing.
TUTORIALS
TUESDAY, OCTOBER 14, 8:30–4:30 (FULL-DAY)
TA Critical Thinking for Software Testers
Michael Bolton, DevelopSense
Critical thinking is the kind of thinking that specifically looks for problems and mistakes. Regular people don’t do a lot of it. However, if you
want to be a great tester, you need to be a great critical thinker. Critically thinking testers save projects from dangerous assumptions and
ultimately from disasters. The good news is that critical thinking is not just innate intelligence or a talent—it’s a learnable and improvable
skill you can master. Michael Bolton shares the specific techniques and heuristics of critical thinking and presents realistic testing puzzles that help you
practice and increase your thinking skills. Critical thinking begins with just three questions—Huh? Really? and So?—that kick start your brain to analyze
specifications, risks, causes, effects, project plans, and anything else that puzzles you. Join Michael for this interactive, hands-on session and practice your
critical thinking skills. Study and analyze product behaviors and experience new ways to identify, isolate, and characterize bugs. Participants are strongly
encouraged to bring a Windows-compatible computer to the class.
TB Successful Test Automation: A Manager’s View
Mark Fewster, Grove Consultants
Many organizations invest substantial time and effort in test automation but do not achieve the significant returns they expected. Some
blame the tool they used; others conclude test automation just doesn’t work in their situation. The truth, however, is often very different.
These organizations are typically doing many of the right things but they are not addressing key issues that are vital to long term test
automation success. Describing the most important issues that you must address, Mark Fewster helps you understand and choose the best approaches
for your organization—no matter which automation tools you use. We’ll discuss both management issues—responsibilities, automation objectives, and
return on investment—and technical issues—testware architecture, pre- and post-processing, and automated comparison techniques. If you are involved
with managing test automation and need to understand the key issues in making test automation successful, join Mark for this enlightening tutorial.
THE RIGHT FORMULA FOR GROUPS
Bring your team and save up to 30%
on each registration!
See page 31 for more details.
TUTORIALS
TUESDAY, OCTOBER 14, 8:30–12:00 (HALF-DAY – MORNING)
TC Fundamental Test Design Techniques
NEW
Lee Copeland, Software Quality Engineering
As testers, we know that we can define many more test cases than we will ever have time to design, execute, and report. The key problem in testing is choosing
a small, “smart” subset from the almost infinite number of tests available that will find a large percentage of the defects. Join Lee Copeland to discover how to
design test cases using formal black-box techniques, including equivalence class testing, boundary value testing, decision tables, and state-transition diagrams.
Explore examples of each of these techniques in action. Don’t just pick test cases at random. Rather, learn to selectively choose a set of test cases that maximizes
your effectiveness and efficiency to find more defects in less time. Then, learn how to use the test results to evaluate the quality of both your products and your testing. Discover
the test design techniques that will make your testing more productive.
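Techniques like equivalence class and boundary value testing are mechanical enough to sketch in a few lines. A hedged illustration (the age-field requirement and its 18–65 range are invented for this example, not taken from the tutorial):

```python
def boundary_values(low, high):
    """Classic boundary-value analysis: for a valid range [low, high],
    test just below, at, and just above each boundary."""
    return [low - 1, low, low + 1, high - 1, high, high + 1]

def equivalence_classes(low, high):
    """One representative per class: below-range, in-range, above-range.
    Any member of a class should behave like every other member."""
    return {
        "invalid_low": low - 10,
        "valid": (low + high) // 2,
        "invalid_high": high + 10,
    }

# Hypothetical requirement: an age field accepting 18-65 inclusive.
print(boundary_values(18, 65))        # [17, 18, 19, 64, 65, 66]
print(equivalence_classes(18, 65))    # one test per class instead of 48 ages
```

Six boundary tests plus three class representatives replace exhaustive testing of every age, which is exactly the "small, smart subset" idea.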
TD Exploratory Testing Is Now in Session
Jon Bach, eBay, Inc.
The nature of exploration, coupled with the ability of testers to rapidly apply their skills and experience, make exploratory testing a widely used test approach—
especially when time is short. Unfortunately, exploratory testing often is dismissed by project managers who assume that it is not reproducible, measurable, or
accountable. If you have these concerns, you may find a solution in a technique called session-based test management (SBTM), developed by Jon Bach and his
brother James to specifically address these issues. In SBTM, testers are assigned areas of a product to explore, and testing is time boxed in “sessions” that have
mission statements called “charters” to create a meaningful and countable unit of work. Jon discusses—and you practice—the skills of exploration using the SBTM approach. He
demonstrates a freely available, open source tool to help manage your exploration and prepares you to implement SBTM in your test organization.
TE Integrating Automated Testing into DevOps
NEW
Jeff Payne, Coveros
In many organizations, agile development processes are driving the pursuit of faster software releases, which has spawned a set of new practices called
DevOps. DevOps stresses communications and integration between development and operations, including continuous integration, continuous delivery, and
rapid deployments. Because DevOps practices require confidence that changes made to the code base will function as expected, automated testing is an
essential ingredient. Join Jeff Payne as he discusses the unique challenges associated with integrating automated testing into continuous integration/continuous
delivery (CI/CD) environments. Learn the internals of how CI/CD works, appropriate tooling, and test integration points. Find out how to integrate your existing test automation
frameworks into a DevOps environment and leave with a roadmap for integrating test automation with continuous integration and delivery.
TF Essential Test Management and Planning
Rick Craig, Software Quality Engineering
The key to successful testing is effective and timely planning. Rick Craig introduces proven test planning methods and techniques, including the Master Test Plan
and level-specific test plans for acceptance, system, integration, and unit testing. Rick explains how to customize an IEEE-829-style test plan and test summary
report to fit your organization’s needs. Learn how to manage test activities, estimate test efforts, and achieve buy-in. Discover a practical risk analysis technique
to prioritize your testing and become more effective with limited resources. Rick offers test measurement and reporting recommendations for monitoring the
testing process. Discover new methods and develop renewed energy for taking your organization’s test management to the next level.
TG Testing Cloud Services
NEW
Martin Pol and Jeroen Mengerink, Polteq Test Services B.V.
Cloud computing is rapidly changing the way systems are developed, tested, and deployed. New system hosting capabilities—software as a
service (SaaS), platform as a service (PaaS), infrastructure as a service (IaaS)—are forcing us to review and revise our testing processes. At the
same time, cloud computing is affording us opportunities to employ new test tooling solutions, which we call testing as a service (TaaS). In this
technical session, Martin Pol and Jeroen Mengerink focus on testing SaaS systems, describing relevant IaaS and PaaS capabilities along the way.
They discuss how to test performance of the cloud itself and ways to take advantage of the resource elasticity afforded by cloud computing. Martin and Jeroen explore the risks—
some traditional, others completely new—that arise when organizations implement cloud computing and describe the tests you need to design to mitigate these risks.
Delegates will receive a free copy of the book Testing Cloud Services by Kees Blokland, Jeroen Mengerink, and Martin Pol.
TH Planning, Architecting, Implementing, and Measuring Automation
NEW
Mike Sowers, Software Quality Engineering
In automation, we often use several different tools that are not well integrated. These tools have been developed or acquired over time with little consideration
of an overall plan or architecture and without considering the need for integration. As a result, both efficiency and effectiveness suffer, and additional time and
money are spent. Ensuring that tools we currently have, or the tools we develop or acquire in the future, work well with other application lifecycle tools is critical
to our testing team’s success. We must drive the adoption of automation across multiple project teams and departments and communicate the benefits to our
stakeholders. Join Mike Sowers as he shares his experiences in creating an automation plan, developing an automation architecture, and establishing tool metrics in multiple
organizations. Mike will discuss both the good (engaging the technical architecture team) and bad (too much isolation between test automators and test designers) on his
automation journey in a large enterprise.
TI Test Management for Large, Multi-Project Programs
NEW
Geoff Horne, NZTester magazine
Running a test project can be a challenge. Running a number of test projects as part of a portfolio can be even more challenging. However, most challenging of
all can be running a group of projects in which every project needs to merge at a single end point. Geoff Horne considers: How does a program test manager
(PTM) slice up the testing work packages and then group them by “like” types into discrete projects? How does the PTM determine the best approach for each
project while maintaining the most advantageous approach for the overall program? How does each project fit into the overall test strategy? These and other
questions are the everyday challenges of the PTM. Maintaining forward momentum at the required rate across many different tracks, all heading for a single end point, requires
skill and experience at many levels. Join Geoff to learn how to qualify, quantify, and effectively run any size test program like a well-oiled machine.
TJ Exploring Usability Testing for Mobile and Web Technologies
Rob Sabourin, AmiBug.com, Inc.
It’s not enough to verify that software conforms to requirements by passing established acceptance tests. Successful software products engage, entertain,
and support the users’ experience. Goals vary from project to project, but no matter how robust and reliable your software is, if your users do not embrace it,
business can slip from your hands. Rob Sabourin shares how to elicit effective usability requirements with techniques such as storyboarding and task analysis.
Together, testers, programmers, and users collaborate to blend the requirements, design, and test cycles into a tight feedback loop. Learn how to select a subset
of system functions to test with a small group of users to get high value information at low cost. Learn how usability testers can take advantage of naïve questions from novice
users as well as the tunnel vision and bias of domain experts. Rob shares examples of usability testing for a variety of technologies including mobile and web-based products.
TUTORIALS
TUESDAY, OCTOBER 14, 1:00–4:30 (HALF-DAY – AFTERNOON)
TK Pairwise Testing Explained
NEW
Lee Copeland, Software Quality Engineering
Many software systems are required to process huge combinations of input data, all of which deserve to be tested. Since we rarely have time to create and
execute test cases for all combinations, the fundamental problem in testing is how to choose a reasonably-sized subset that will find a large percentage of the
defects and can be performed within the limited time and budget available. Pairwise testing, the most effective test design technique to deal with this problem,
is unfortunately not understood by many testers. The answer is not to attempt to test all combinations of all values for all input variables but to test all pairs of
variables. This significantly reduces the number of tests that must be created and run but still finds a large percentage of defects. With examples of the effectiveness of pairwise
testing, Lee Copeland demonstrates this technique through the use of orthogonal arrays, James Bach’s all-pairs algorithm, and Microsoft’s PICT tool. Learn to apply the pairwise
testing technique as you work through a number of hands-on exercises.
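The claim that a handful of tests can cover every pair of values is easy to check mechanically. A sketch of such a coverage check (the three parameters and their values are invented for illustration; real tools like PICT generate the suite as well as verify it):

```python
from itertools import combinations, product

parameters = {
    "browser": ["Chrome", "Firefox"],
    "os": ["Windows", "macOS"],
    "locale": ["en", "de"],
}

def uncovered_pairs(tests, parameters):
    """Return every (param, value) pair combination not exercised by `tests`."""
    missing = []
    for (p1, vals1), (p2, vals2) in combinations(parameters.items(), 2):
        for v1, v2 in product(vals1, vals2):
            if not any(t[p1] == v1 and t[p2] == v2 for t in tests):
                missing.append(((p1, v1), (p2, v2)))
    return missing

# Full combination coverage would need 2*2*2 = 8 tests; these 4 hit all pairs.
suite = [
    {"browser": "Chrome",  "os": "Windows", "locale": "en"},
    {"browser": "Chrome",  "os": "macOS",   "locale": "de"},
    {"browser": "Firefox", "os": "Windows", "locale": "de"},
    {"browser": "Firefox", "os": "macOS",   "locale": "en"},
]

print(uncovered_pairs(suite, parameters))  # [] -- every pair is covered
```

With more parameters the savings grow dramatically: the pairwise suite grows roughly with the product of the two largest value sets, not with the product of all of them.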
TL Security Testing for Test Professionals
Jeff Payne, Coveros, Inc.
Today’s software applications are often security critical, making security testing an essential part of a software quality program. Unfortunately, most testers have
not been taught how to effectively test the security of the software applications they validate. Join Jeff Payne as he shares what you need to know to integrate
effective security testing into your everyday software testing activities. Learn how software vulnerabilities are introduced into code and exploited by hackers.
Discover how to define and validate security requirements. Explore effective test techniques for assuring that common security features are tested. Learn
about the most common security vulnerabilities and how to identify key security risks within applications and to use testing to mitigate them. Understand how to security test
applications—both web- and GUI-based—during the software development process. Review examples of how common security testing tools work and assist the security testing
process. Take home valuable tools and techniques for effectively testing the security of your applications going forward.
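Among the common vulnerabilities Jeff covers, SQL injection is simple to demonstrate offline. A minimal sketch using Python's sqlite3 (the users table and login queries are invented for illustration), contrasting an injectable string-built query with a parameterized one:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_vulnerable(name, password):
    # BAD: attacker-controlled strings are concatenated into the SQL text.
    query = ("SELECT COUNT(*) FROM users WHERE name = '%s' "
             "AND password = '%s'" % (name, password))
    return conn.execute(query).fetchone()[0] > 0

def login_safe(name, password):
    # GOOD: placeholders keep data out of the SQL grammar entirely.
    query = "SELECT COUNT(*) FROM users WHERE name = ? AND password = ?"
    return conn.execute(query, (name, password)).fetchone()[0] > 0

attack = "' OR '1'='1"
print(login_vulnerable("alice", attack))  # True -- authentication bypassed
print(login_safe("alice", attack))        # False -- injection neutralized
```

Feeding the same attack string to both functions is itself a security test: the vulnerable version "logs in" without a valid password because the injected OR clause makes the WHERE condition true for every row.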
TM End-to-End Testing with the Heuristic Software Test Model
NEW
Paul Holland, Doran Jones, Inc.
You have just been assigned a new testing project. Where do you start? How do you develop a plan and begin testing? How will you report on your progress?
Paul Holland shares new test project approaches that enable you to plan, test, and report effectively. Paul demonstrates ideas, based on the Heuristic Software Test
Model from Rapid Software Testing, that can be directly applied or adapted to your environment. In this hands-on tutorial, you’ll be given a product to test. Start by
creating three raw lists (Product Coverage Outline, Potential Risks, and Test Ideas) that help ensure comprehensive testing. Use these lists to create an initial set of
test charters. We employ “advanced” test management tools (Excel and whiteboards with Sticky Notes) to create useful test reports without using “bad metrics” (counts of pass/fail
test cases, % of test cases executed vs. plan). Look forward to your next testing project with these new ideas and your improved planning, testing, and reporting skills.
TN Testing the Data Warehouse—Big Data, Big Problems
Geoff Horne, NZTester magazine
Data warehouses have become a popular mechanism for collecting, organizing, and making information readily available for strategic decision making. The
ability to review historical trends and monitor near real-time operational data has become a key competitive advantage for many organizations. Yet the methods
for assuring the quality of these valuable assets are quite different from those of transactional systems. Ensuring that the appropriate testing is performed is a
major challenge for many enterprises. Geoff Horne has led a number of data warehouse testing projects in both the telecommunications and ERP sectors. Join
Geoff as he shares his approaches and experiences, focusing on the key “uniques” of data warehouse testing including methods for assuring data completeness, monitoring
data transformations, and measuring quality. He also explores the opportunities for test automation as part of the data warehouse process, describing how it can be harnessed to
streamline and minimize overhead.
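One of the "uniques" mentioned, assuring data completeness, is often approached by reconciling row counts and control totals between source and warehouse. A toy sketch with sqlite3 (the order tables and columns are invented for illustration; table names here come from trusted configuration, never from user input):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER, amount REAL);
    CREATE TABLE dw_orders  (id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
    INSERT INTO dw_orders  VALUES (1, 10.0), (2, 25.5), (3, 7.25);
""")

def completeness_check(conn, source, target, amount_col="amount"):
    """Compare row counts and a control total between source and warehouse.

    Matching counts and sums don't prove the load is correct, but a
    mismatch proves it is not -- a cheap first gate before row-level diffs.
    """
    q = "SELECT COUNT(*), COALESCE(SUM(%s), 0) FROM %s"
    src_count, src_sum = conn.execute(q % (amount_col, source)).fetchone()
    dw_count, dw_sum = conn.execute(q % (amount_col, target)).fetchone()
    return {"rows_match": src_count == dw_count,
            "totals_match": abs(src_sum - dw_sum) < 1e-9}

print(completeness_check(conn, "src_orders", "dw_orders"))
# {'rows_match': True, 'totals_match': True}
```

In practice the same reconciliation runs per load batch and per transformation stage, so a dropped or duplicated slice of data is caught close to where it happened.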
TO Getting Your Message Across: Communication Skills for Testers
NEW
Thomas McCoy, Australian Department of Social Services
Communication is at the heart of our profession. No matter how advanced our testing capabilities are, if we can’t convey our concerns in ways that connect with key
members of the project team, our contribution is likely to be ignored. Because we act solely in an advisory capacity, rather than being in command, our power to
exert influence is almost entirely based on our communication skills. With people suffering information overload and deluged with emails, it is more important than
ever that we craft succinct and effective messages, using a range of communication modalities. Join Thomas McCoy as he draws on techniques from journalism,
public relations, professional writing, psychology, and marketing to help you get your message across. Key themes include: non-verbal communication, presentation skills, persuasive
writing, influencing skills, graphic communication, and communicating in teams and meetings. We will use a range of hands-on exercises to practice the concepts being discussed.
TP Introducing Keyword-Driven Test Automation
Hans Buwalda, LogiGear
In both agile and traditional projects, keyword-driven testing—when done correctly—has proven to be a powerful way to attain a high level of automation.
Many testing organizations use keyword-driven testing but aren’t realizing the full benefits of scalability and maintainability that are essential to keep up with
the demands of testing today’s software. Hans Buwalda describes the keyword approach and how you can use it to meet the very aggressive goal that he calls
the “5 percent challenge”—automate 95 percent of your tests with no more than 5 percent of your total testing effort. Hans also discusses how the keyword
approach relates to other automation techniques like scripting and data-driven testing, and the ways keywords can be used for specific situations like graphics, multimedia, and
mobile. Use the information and the real-world examples that Hans presents to attain a very high level of maintainable automation with the lowest possible effort.
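At its core, the keyword approach separates test cases (rows of business-readable keywords and arguments) from the code that implements each keyword. A stripped-down sketch (the keywords and the fake settings screen are invented here, far simpler than a production framework):

```python
# A fake system under test: a simple key-value "settings" screen.
settings = {}

# Keyword implementations: one small function per business-readable action.
def enter(field, value):
    settings[field] = value

def clear(field):
    settings.pop(field, None)

def check(field, expected):
    actual = settings.get(field)
    assert actual == expected, f"{field}: expected {expected!r}, got {actual!r}"

KEYWORDS = {"enter": enter, "clear": clear, "check": check}

def run_test(rows):
    """Interpret a keyword-driven test: each row is (keyword, *arguments)."""
    for keyword, *args in rows:
        KEYWORDS[keyword](*args)

# A test case a non-programmer could author in a spreadsheet:
run_test([
    ("enter", "username", "alice"),
    ("check", "username", "alice"),
    ("clear", "username"),
    ("check", "username", None),
])
print("keyword test passed")
```

The maintainability payoff is that when the application changes, only the handful of keyword implementations change; the spreadsheet-style test cases survive untouched.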
TQ Test Estimation in Practice
Rob Sabourin, AmiBug.com, Inc.
Anyone who has ever attempted to estimate software testing effort realizes just how difficult the task can be. The number of factors that can affect the estimate
is virtually unlimited. The key to good estimates is to understand the primary variables, compare them to known standards, and normalize the estimates based
on their differences. This is easy to say but difficult to accomplish because estimates are frequently required even when very little is known about the project and
what is known is constantly changing. Throw in a healthy dose of politics and a bit of wishful thinking and estimation can become a nightmare. Rob Sabourin
provides a foundation for anyone who must estimate software testing work effort. Learn about the test team’s and tester’s roles in estimation and measurement, and how to
estimate in the face of uncertainty. Analysts, developers, leads, test managers, testers, and QA personnel can all benefit from this tutorial.
TR Test Automation Strategies for the Agile World
NEW
Bob Galen, Velocity Partners
With the adoption of agile practices in many organizations, the test automation landscape has changed. Bob Galen explores current disruptors to traditional
automation strategies, and discusses relevant and current adjustments you need to make when developing your automation business case. Open source tools
are becoming incredibly viable and beat their commercial equivalents in many ways—not only in cost, but also in functionality, creativity, evolutionary speed,
and developer acceptance. Agile methods have fundamentally challenged our traditional automation strategies. Now we must keep up with incremental and
emergent systems and architectures and their high rates of change. Bob examines new automation strategies for both greenfield applications and those pesky legacy projects. Learn how to wrap a business case and communication plan around them so you get the support you need. Leave the workshop with a serious game plan for delivering on the promise of agile test automation.
CALL 888.268.8770 OR 904.278.0524 TO REGISTER • STARWEST.TECHWELL.COM
KEYNOTES
Testing Experts Share Insight
WEDNESDAY, OCTOBER 15
8:30am Quality Principles for Today’s “Glueware”—Testing Web Services, Libraries, and Frameworks
Julie Gardiner, Redmind
In the past, developers knew every line of code in their applications. They designed it, wrote it, tested it, and controlled it. Today’s applications are far different. Rather than written, they are often assembled—from program language libraries, third-party frameworks, encapsulated web services, and even entire external systems—and glued together with small amounts of code. Before your organization committed to using these external pieces of software, were testers part of the evaluation process? Was the software thoroughly tested before betting your organization’s success on it? Or did everyone just hope for the best? Julie Gardiner explains how to make the business case for including test professionals in the software evaluation to add their unique focus on software quality. If you’re already committed to using vendor-supplied software, Julie describes how to ensure quality from your vendors, on a schedule that meets your needs—not theirs.
10:00am Balancing the Crusty and Old with the Shiny and New
Bob Galen, Velocity Partners
In his journeys, Bob Galen has discovered that testing takes on many forms. Some organizations have no automated tests and struggle to run massive manual regression tests within very short iterative releases. Other organizations are going “all in”—writing thousands of acceptance tests in Gherkin. The resulting imbalance in their testing approaches undermines an organization’s efficiency, effectiveness, and delivery nimbleness. Bob shares ideas to bring balance to testing. He explores the choices: manual vs. automated testing, designed and scripted test cases vs. exploratory tests, and thoroughly planned test projects vs. highly iterative reactive ones. Bob describes how to balance traditional test leadership with an iterative and whole-team view to add value. And finally, he explores the balance of acting as gatekeeper vs. leading the collaboration with stakeholders to find the right requirements that solve their problems. Take away a strategic approach to structure your testing and a renewed understanding of how testing fits into a healthy and balanced culture.
Break Software at the STARWEST Test Lab!
Compete with your fellow testers to find bugs. Come on down and practice your skills and techniques with conference speakers on Wednesday, October 15 and Thursday, October 16.
A principal consultant and head of QA and agile for Redmind, Julie Gardiner provides consultancy and training in all aspects of testing, management, and agile, specializing in risk, agile, test management, and usability. Julie has more than twenty years of experience as developer, DBA, project manager, head of operations management, and head of R&D. She has held positions of test analyst, test team leader, test consultant, and test manager in industries including financial, broadcasting, insurance, utilities, retail, web, telecoms, and the public sector, using approaches from traditional to agile methodologies. Julie is a certified ScrumMaster and agile coach.
An agile methodologist, practitioner, and coach based in Cary, NC, Bob Galen helps guide companies in their adoption of Scrum and other agile methodologies and practices. Bob is a principal agile evangelist at Velocity Partners, a leading agile nearshore development partner; president of RGCG; and a frequent speaker on software development, project management, software testing, and team leadership at conferences and professional groups. He is a Certified Scrum Coach, Certified Scrum Product Owner, and an active member of the Agile and Scrum Alliances. In 2013 Bob published Scrum Product Ownership—Balancing Value from the Inside Out. Reach him at [email protected].
“A few of the keynotes gave me aha! moments.”
— Elaine Soat, QA/QC Manager, Cartegraph Systems
WEDNESDAY, OCTOBER 15
4:30pm Lightning Strikes the Keynotes
Lee Copeland, Software Quality Engineering
Throughout the years, Lightning Talks have been a popular part of the STAR conferences. If you’re not familiar with the concept, a Lightning Talks session consists of a series of five-minute talks by different speakers within one presentation period. Lightning Talks are the opportunity for speakers to deliver their single biggest bang-for-the-buck idea in a rapid-fire presentation. And now, lightning has struck the STAR keynotes. Some of the best-known experts in testing—Mark Fewster, Tom McCoy, Rob Sabourin, Geoff Horne, Johanna Rothman, Jon Bach, Michael Bolton, Andy Glover, Jared Richardson, Mary Thorn—will step up to the podium and give you their best shot of lightning. Get eleven keynote presentations for the price of one—and have some fun at the same time.
PRESENTERS: Mark Fewster, Tom McCoy, Rob Sabourin, Geoff Horne, Johanna Rothman, Jon Bach, Michael Bolton, Andy Glover, Jared Richardson, Mary Thorn
THURSDAY, OCTOBER 16
8:30am Softwarts: Security Testing for Muggles
Paco Hope, Cigital
Security testing is often shrouded in jargon and mystique. Security conjurers perform arcane rites using supposed “black hat” techniques and would have us believe that we cannot do the same. The fact is that security testing “magic” is little more than specialized application of exploratory test techniques we already understand. In this Defense against the Black Hats, Paco Hope dispels the myth that security testing is a magical art. By deconstructing security activities into techniques we already know well, we expand our testing. Security tests can be seamlessly woven into our existing test practices with just a bit of straightforward effort. Glittering gold security bugs can be tracked and managed right alongside the mundane, garden-variety functional ones. The knowledge that we need to do meaningful security testing is accessible and can be learned. If you can test functionality, you can test security. When our day-to-day tests include security too, our software does not fall prey to the hackers’ sleight-of-hand and conjurers’ tricks.
A principal consultant for Cigital, Paco Hope has deep experience in securing software and systems. Paco’s experience covers web applications, online gaming, embedded devices, lotteries, and business-to-business transaction systems. He has worked with small startups and large enterprises in architecture risk analysis, secure code review, penetration testing, and other consulting. Acting president of the London Chapter of (ISC)², Paco serves on (ISC)²’s Application Security Advisory Board, authoring questions for the CISSP and CSSLP certifications. He coauthored the Web Security Testing Cookbook, Mastering FreeBSD and OpenBSD Security, and a chapter of Building Security In.
THURSDAY, OCTOBER 16
4:15pm Why Testers Need to Code: Facebook’s No Testing Department Approach
Simon Stewart, Facebook
Software development release cycles are being compressed. The luxury of time is being taken from us, and the stakes of a botched release are higher than ever. Facebook now releases a new stable build of their flagship mobile app approximately once a month, with alpha builds of the Android app going out to the public five times a week. The cost to the users’ experience with Facebook (and the company’s reputation!) of a botched release is astronomical. Simon Stewart describes the changing face of software development, examines current trends, and discusses where software testing and testers fit into this brave new world. He explains why Facebook has no testing department, why he believes this will become more widespread, why testers who are able to code are important to the future of the industry, and why being able to code offers you the ability to be an order of magnitude more effective.
A software engineer at Facebook, Simon Stewart helps build the tooling for testing their mobile applications. Simon is the current lead of the Selenium project, the creator of Selenium WebDriver, co-editor of the W3C WebDriver specification, Facebook’s W3C AC representative, and contributor to Selendroid, a mobile implementation of the WebDriver protocol targeting Android. As his experience and current work suggest, Simon is keen on automated testing and views it as instrumental to allowing software development with today’s compressed release cycles. The way Simon sees it, testers can spend time doing things more creative than walking through tedious checklists of details. Simon tweets at @shs96c.
CONCURRENT SESSIONS
WEDNESDAY, OCTOBER 15, 11:30am
W1
TEST MANAGEMENT
Building Quality In: Adopting the Tester’s Mindset
Stephen Vance, Stellar Advances
When trying to improve the development process, agile and lean transformations often start by focusing on engineering. Product management and development get a lot of attention; however, tester is not one of the defined Scrum roles. Despite the attention given to automated tests in agile, many transformations seem lost in knowing how to engage testers—and testers struggle to find their place. But the tester’s mindset—applying an investigatory and explorative perspective for empirical and hypothesis-driven improvement—is essential to agile transformation at all organizational levels, from the individual team to the boardroom. Stephen Vance shows how applying the tester’s mindset at the beginning of development more effectively supports efforts to build quality in—rather than detecting problems after they occur. Learn how to flip your thinking by applying the tester’s mindset to drive change, incorporating ideas from software craftsmanship, systems thinking, lean manufacturing, lean startup, Net Promoter System, and more.
W4
AGILE TESTING
A Tester’s Guide to Collaborating with Product Owners
Bob Galen, Velocity Partners
The role of the Product Owner in Scrum is only vaguely defined—owning the Product Backlog and representing the “customer.” In many organizations, Product Owners go it alone, trying their best to represent business needs to their teams. What’s often missing is a collaborative connection between the team’s testers and the Product Owner—a connection in which testers help to define and refine requirements, broaden the testing landscape and align it to customer needs, provide a conduit for collaboration between the customer and the team, assure that the team is building the right thing, and help demonstrate complete features. This relationship is central to the team and facilitates transparency to help gain feedback from the entire organization. Join seasoned agile coach Bob Galen as he shares techniques for doing just this. Return with new ideas and techniques for helping your Product Owner and team deliver better received and higher value products—not just by testing but by fostering collaboration.
W2
TEST TECHNIQUES
Testing Lessons Learned from Sesame Street
Rob Sabourin, AmiBug.com
Rob Sabourin has discovered testing lessons in the Simpsons, the Looney Tunes gang, Great Detectives, Dr. Seuss, and other unlikely places, but this year he journeys to Sesame Street. Sesame Street teaches basic life skills in a safe, entertaining, memorable style. Rob uses them to solve stubborn technical, management, and people-related testing problems. Oscar the Grouch guides us through failure mode analysis. Ernie and Bert help us tackle problems from different perspectives. Big Bird and Mr. Snuffleupagus teach about persistence, rhetoric, and bug advocacy. The Count misdirects management with fallacious metrics. And Kermit demonstrates that it is not easy being a tester, but we can make a difference by getting the right things done well. Sesame Street songs teach testing lessons, too. Rob performs a powerful affinity analysis singing “One of these things…”. Enjoy testing lessons brought to you by Rob and your friends at Sesame Street.
W5
PERSONAL EXCELLENCE
Growing into Leadership
Peter Walen, Software Testing & Anthropology
Pete Walen is not going to tell you how to be a good test manager. Instead, Pete shares ideas on becoming a true leader. While some managers certainly are leaders, testers of all varieties and experience levels can become leaders. Developing technical leadership skills, regardless of job title, involves overcoming our own uncertainties, self-doubts, and perceptions. Learning to foster relationships while perfecting our craft is a challenge for everyone, particularly when others look to us to be an expert—even when we don’t feel like one. Pete presents choices, options, and paths available to software professionals, including opportunities for self-education, networking, and other professional and technical development. He describes how he learned to apply these lessons in day-to-day work situations, building skills for himself and his co-workers. In this interactive discussion, Pete shares his mistakes and successes, what he learned from each, and what opportunities there are for you to grow as a leader in your own right.
W3
TEST AUTOMATION
Why Automation Fails—in Theory and Practice
Jim Trentadue, Ranorex
Testers face common challenges in automation. Unfortunately, these
challenges often lead to subsequent failures. Jim Trentadue explains
a variety of automation perceptions and myths—the perception that
a significant increase in time and people is needed to implement
automation; the myth that, once automation is achieved, testers will
not be needed; the myth that scripted automation will serve all the
testing needs for an application; the perception that developers
and testers can add automation to a project without additional
time, resources, or training; the belief that anyone can implement
automation. The testing organization must ramp up quickly on the
test automation process and the prep-work analysis that needs to
be done including when to start, how to structure the tests, and
what system to start with. Learn how to respond to these common
challenges by developing a solid business case for increased
automation adoption by engaging manual testers in the testing
organization, being technology agnostic, and stabilizing test scripts
regardless of application changes.
W6
SPECIAL TOPICS
Testing Compliance with Accessibility
Guidelines
Anish Krishnan, Hexaware Technologies, Ltd.
Currently, 2.4 billion people use the Internet, and about 10 percent
of the world’s population has some form of disability. This means
millions of potential users will have difficulty accessing the Internet.
Thus, accessibility testing should not be ignored. Anish Krishnan
discusses the importance of accessibility testing, reasons for
considering accessibility issues while designing, and international
Web accessibility laws. He shares effective techniques for carrying
out accessibility testing, the potential scope of this testing, myths
surrounding accessibility testing, and a set of automated tools to
support this testing. Join Anish to learn about the Section 508
standards and how to test for web accessibility using screen readers
and open source tools. Experience screen reader technology on both
an accessible and non-accessible site. Learn how your test team can
be advocates of accessible websites throughout the project lifecycle
and add accessibility testing to your testing capabilities.
CONCURRENT SESSIONS
WEDNESDAY, OCTOBER 15, 1:45pm
W7
TEST MANAGEMENT
The Role of Testing: Quality Police or Quality Communicator?
Mike Duskis, 10–4 Systems
An underwear advertisement in 1985 featured the dedicated and thorough Inspector 12 saying, “They don’t say Hanes until I say they say Hanes.” Historically, software testers have been called on to perform a similar role—preventing defective products from reaching customers. However, software development is not underwear manufacturing. The specifications are less clear and the acceptance criteria more complex. Why then do organizations continue to place acceptance decisions in the hands of testers? Because they lack the information required to make a sound decision. Mike Duskis presents one way out of this mess—empower the organization to make acceptance decisions with confidence. This requires a move away from producing binary pass/fail test results toward gathering, organizing, and providing the information which the business needs to assess product risk and quality. Learn strategies and techniques you can use to stop playing the inspector role and begin to position yourself as a provider of critical information.
W10
AGILE TESTING
Agile Development and Testing in a Regulated Environment
John Pasko, Karl Storz Imaging
One of the Agile Principles states that working software is the primary measurement of success—generally measured by the level of customer satisfaction. So, how do you measure “customer satisfaction” when it is based on successful surgical outcomes? Join John Pasko as he takes you through a case study of the design, development, testing, and release of a complex system—integrating embedded software with hardware—for a surgical product which met stringent FDA standards and regulations. Cross-functional teams composed of Product Owners, software engineers, and QA engineers used agile, TDD, continuous integration, and automated and manual acceptance testing to create iterative in-house releases to our Product Owner—the internal customer. When all requirements and standards were satisfied, we released the completed product for use in medical facilities. Learn how we satisfied regulatory requirements and provided an audit trail by using a formal tool to map requirements, handle change requests, and monitor defects and their corresponding fixes.
W8
TEST TECHNIQUES
Virtualization: Improve Speed and Increase Quality
Clint Sprauve, HP
Many development and test organizations must work within the confines of compressed release cycles, various agile methodologies, and cloud and mobile environments for their business applications. So, how can test organizations keep up with the pace of development and increase the quality of their applications under test? Clint Sprauve describes how service virtualization and network virtualization can help your team improve speed and increase quality. Learn how to use service virtualization to simulate third-party or internal web services to remove wait times and reduce the need for high cost infrastructures required for testing. Take back techniques for incorporating network virtualization into the testing environment to simulate real-world network conditions. Learn from Clint how the combination of service and network virtualization allows teams to implement a robust and consistent continuous testing strategy to reduce defects in production applications.
W11
PERSONAL EXCELLENCE
Adventures of a Social Tester
Martin Nilsson, House of Test
If we know that good co-worker relationships can positively impact our success, why don’t we take a systematic approach to relationship building? Martin Nilsson shares how building personal relationships has helped develop his personal competency. Even though Martin’s technical skills are high, his greatest successes as a tester have come from his ability to build relationships. He shares how a focused effort at building rapport resulted in greater cooperation. When he was mistaken for a test lead during a project, Martin learned that having coffee with someone can trump an email. Today, to understand the systems with which he works, he uses tools that map team members and their interactions. Martin shares how he applied those tools as a project test manager to understand the situation of a group of test leads and managers and remedy the problems that were keeping them from working together effectively.
W9
TEST AUTOMATION
Functional Testing with Domain-Specific Languages
Tariq King, Ultimate Software
Developing high-quality software requires effective communication among various project stakeholders. Business analysts must elicit customer needs and capture them as requirements, which developers then transform into working software. Software test engineers collaborate with business analysts, domain experts, developers, and other testers to validate whether the software meets the customer’s expectations. Misunderstandings between different stakeholders can introduce defects into software, reducing its overall quality and threatening the project’s success. Domain-specific languages (DSLs) are special-purpose languages created to describe tasks in a particular field. DSLs provide stakeholders with a common vocabulary for describing application elements and behaviors. Tariq King describes how DSLs can be leveraged during functional testing to help identify potential issues early and reduce misunderstanding. Tariq demonstrates how a well-designed testing DSL allows non-technical stakeholders to read and write automated tests, better engaging them in software testing activities. Learn how DSL-based testing tools can be used to improve test case management, regression testing, and test maintenance.
W12
SPECIAL TOPICS
Test Improvement in Our Rapidly Changing World
Martin Pol, Polteq Testing Services BV
In organizations adopting the newest development approaches, classical test process improvement models no longer fit. A more flexible approach is required today. Solutions like SOA, virtualization, web technology, cloud computing, mobile, and the application of social media have changed the IT landscape. In addition, we are innovating the way we develop, test, and manage. Many organizations are moving toward a combination of agile/Scrum, context-driven testing, continuous integration and delivery, DevOps, and TestOps. Effective test automation has become a prerequisite for success. And all of these require a different way of improving testing, an adaptable way that responds to innovations in both technology and development. Martin Pol shares a roadmap that enables you to translate the triggers and objectives for test improvement into actions that can be implemented immediately. Learn how to achieve continuous test improvement in any situation, and take away a practical set of guidelines to enable a quick start.
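As a concrete taste of the testing-DSL idea described in session W9 (the mini-grammar and the toy shopping cart below are invented for illustration; they are not Tariq’s actual tooling), a few lines of Python can turn readable, English-like steps into executable checks:

```python
# Sketch of a tiny testing DSL: plain-text steps interpreted as checks.
# The "Given/When/Then" grammar and the Cart model are invented here.
import re

class Cart:
    """Stand-in for the application under test."""
    def __init__(self):
        self.items = []
    def add(self, item):
        self.items.append(item)

def run_spec(spec):
    """Interpret a plain-text spec, one DSL step per line."""
    cart = Cart()
    for line in spec.strip().splitlines():
        line = line.strip().lower()
        if re.fullmatch(r"given an empty cart", line):
            cart = Cart()                       # reset state
        elif m := re.fullmatch(r"when i add (\w+)", line):
            cart.add(m.group(1))                # perform an action
        elif m := re.fullmatch(r"then the cart has (\d+) items?", line):
            assert len(cart.items) == int(m.group(1)), f"failed: {line}"
        else:
            raise ValueError(f"unknown step: {line}")

# A non-technical stakeholder could read, and plausibly write, this:
run_spec("""
    Given an empty cart
    When I add apples
    When I add bananas
    Then the cart has 2 items
""")
```

A stakeholder who has never seen the interpreter can still follow the spec at the bottom, which is the engagement benefit the session describes.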
CONCURRENT SESSIONS
WEDNESDAY, OCTOBER 15, 3:00pm
W13
TEST MANAGEMENT
The Test Manager’s Role in Agile: Balancing the Old and the New
Mary Thorn, ChannelAdvisor
What do test managers do? In traditional organizations, they assign people to projects, oversee the testers’ progress, provide feedback, and perhaps offer coaching to people who want it. Test managers are the go-to people when you don’t know how to do something—not because they know, but because they know who does know. How does that change with a transition to agile? Do we still need test managers? As one who has successfully made the transition from traditional to agile test manager, Mary Thorn shares keys to the transition. Explore why establishing a mission, vision, and strategy for your agile test team is vital. Learn why cross-organizational transparency, communication, and bridge-building become your prime responsibilities. Review models for building great agile test teams—teams who successfully balance old and new techniques to provide customer value. In the end, Mary inspires you to reach a higher level of test leadership.
W16
AGILE TESTING
Your Team’s Not Agile If You’re Not Doing Agile Testing
Jeanne Schmidt, Rural Sourcing, Inc.
Many organizations adopt agile software development processes, yet they do not adopt agile testing processes. Then they fall into the trap of having development sprints that are just a set of mini-waterfall cycles. Some software developers still feel they can work more quickly if they let QA test after code is completed. Jeanne Schmidt identifies simple ways to get your team to adopt agile testing methods. Embracing agile testing requires you to change processes, responsibilities, and team organization. Jeanne details specifically how agile testers can add value by participating both at the beginning of each iteration and at the end of each sprint. She describes different ways you can pair your team members and different techniques for teaching developers the value of testing. Finally, Jeanne offers solutions for managing resistance to change and leading all team members to take responsibility for the product quality.
W14
TEST TECHNIQUES
Testing the New Disney World Website
Les Honniball, Walt Disney Parks and Resorts Technology
At Walt Disney Parks and Resorts Technology, we provide the applications and infrastructure our online guests use to plan, book, explore, and enjoy their stay at our parks and resorts. With millions of page views per day and a multi-billion dollar ecommerce booking engine, we face a unique set of challenges. Join Les Honniball for insights into how they work with Product Owners and development teams to design tests, both manual and automated, for these challenges. Les explains the testing processes that support a global set of brands on one web platform, including successful QA strategies, analytics, and user experience design—all while working within an agile development process. Discover how Les and his team of QA engineers work with various development teams in Orlando, FL, Glendale, CA, and Argentina to support many areas of Walt Disney Parks and Resorts Technology Business.
W15
TEST AUTOMATION
End-to-End Test Automation with Open Source Technologies
Ramandeep Singh, QA InfoTech
As organizations continue to adopt agile methodologies, testers are getting involved earlier in product testing. They need tools that empower them to manage varied test automation needs for web services, web APIs, and web and mobile applications. Open source solutions are available in abundance. However, most of these solutions are independent and not integrated, significantly increasing the tester’s work around test automation development. Ongoing test automation suite evolution and building a robust regression test suite have become cumbersome. Join Ramandeep Singh as he shares the idea of a comprehensive end-to-end automation framework to minimize the efforts spent in using existing test automation solutions across all aspects of the application. Take back techniques to create effective automated tests that are robust and reusable across multiple forms of the same application. Learn to help functional testers effectively use test automation, simplified through a comprehensive framework, to efficiently build automated test cases.
W17
PERSONAL EXCELLENCE
Speak Like a Test Manager
Mike Sowers, Software Quality Engineering
Ever feel like your manager, development manager, product manager, product owner, or (you fill in the blank) is not listening to you or your team? Are you struggling to make an impact with your messages? Are you “pushing a wet rope uphill” in championing product quality? Are you talking, but no one is listening? Mike Sowers shares practical examples of how to more effectively speak like a test manager and offers concrete advice based on his experiences in the technology, financial, transportation, and professional services sectors. Mike discusses communication and relationship styles that work—and some that have failed—and shares key principles (e.g., seeking to understand), approaches (e.g., using facts), and attributes (e.g., being proactive) to help you grow and prosper as a test manager. Leave with practical ideas to boost your communications skills and influence to become a trusted advisor to your team and your management.
W18
SPECIAL TOPICS
Implementing Outsourced Testing Services with a Third Party
Shelley Rueger, Moxie Software
Outsourced testing services are all the rage today. But are they really faster, better, and cheaper? Shelley Rueger shares how you can improve the efficiency and effectiveness of your test process using a third-party test service. She provides guidance on how to determine if your product is a good candidate for testing services, how to select the right vendor, and how to avoid common pitfalls. Shelley discusses her team’s experience as they made the transition from in-house testing to using external testing services. She addresses questions including: When should you outsource testing? When should you not? What questions should you ask a test services company before engaging them? What issues should you keep an eye out for during the transition? Leave with an actionable plan for implementing successful third-party testing within your organization—or come away with the knowledge that it will not be right for you.
CONCURRENT SESSIONS
THURSDAY, OCTOBER 16, 9:45am
T1
TEST MANAGEMENT
“Rainmaking” for Test Managers
Julie Gardiner, Redmind
The dictionary defines a rainmaker as “an executive (or lawyer) in the unsentimental world of business with an exceptional ability to attract clients, use political connections, increase profits, etc.” Simply put, a rainmaker is someone who gets things done. Is this relevant to testing? Absolutely! It is too easy to get stuck in the status quo and to avoid trying something new because everything works well as it is. But what we do can always be made better. Julie Gardiner focuses on two key areas—(1) becoming a Trusted Advisor, and (2) adapting rainmaking principles to the testing role. Join Julie as she discusses rainmaking topics: The power of relationships—What is your market? Credibility—What is it? How do we get it? and The Platinum Rule—why should we always follow it? If you are looking for ways to enhance your ability to make testing—and your company—work even better, then this session is for you!
T4
MOBILE TESTING
Top Ten Attacks to Break Mobile Apps
Jon Hagar, Grand Software Testing
To aid development in the mobile and smartphone app world, testers must do more than simply test against requirements; they should include attack-based testing to find common errors. In the tradition of James Whittaker’s How to Break Software books, Jon Hagar applies the testing “attack” concept to mobile app software, defines the domain of mobile app software, and examines common industry patterns of product failures. Jon then shares a set of ten software test attacks, based on the most common modes of failure in native, web-based, and hybrid apps. Developers and testers can use these attacks against their own software to find errors more efficiently. Jon describes why each attack works with its pros and cons. He provides information on how attacks can be used to cover many different quality attributes beyond testing only functionality.
T2
Jeff Payne, Coveros, Inc.
Gareth Bowles, Netflix
The cloud is all about redundancy and fault tolerance. Since no single
component can guarantee 100 percent uptime, we have to design
architectures where individual components can fail without affecting
the availability of the entire system. But just designing a fault tolerant
architecture is not enough. We have to constantly test our ability to
actually survive these “once in a blue moon” failures. And the best
way is to test in an environment that matches production as closely
as possible or, ideally, actually in production. This is the philosophy
behind Netflix’ Simian Army, a group of tools that randomly induces
failures into individual components to make sure that the overall
system can survive. Gareth Bowles introduces the main members of
the Simian Army―Chaos Monkey, Latency Monkey, and Conformity
Monkey. Gareth provides practical examples of how to use them in
your test process—and, if you’re brave enough, in production.
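The Simian Army idea can be sketched in a few lines. This is a toy in-process model with invented names (`Cluster`, `chaos_monkey`), not Netflix's actual tooling, which terminates real cloud instances:

```python
import random

class Cluster:
    """Toy model of a redundant service: several instances behind a balancer."""
    def __init__(self, instances):
        self.healthy = set(instances)

    def handle_request(self):
        # The system survives as long as at least one instance is healthy.
        if not self.healthy:
            raise RuntimeError("total outage: no healthy instances")
        return random.choice(sorted(self.healthy))

def chaos_monkey(cluster, rng):
    # Randomly terminate one healthy instance, as Chaos Monkey does.
    if cluster.healthy:
        victim = rng.choice(sorted(cluster.healthy))
        cluster.healthy.discard(victim)
        return victim
    return None

def survives_failures(n_instances, n_failures, seed=0):
    """Inject random terminations, then check the cluster still serves traffic."""
    rng = random.Random(seed)
    cluster = Cluster(f"i-{k}" for k in range(n_instances))
    for _ in range(n_failures):
        chaos_monkey(cluster, rng)
    try:
        cluster.handle_request()
        return True
    except RuntimeError:
        return False
```

The point of the exercise is the same as in production: prove that availability holds while components are being killed, not just when everything is up.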
T3
TEST AUTOMATION
A Path through the Jungle: Validating a Test Automation System for the FDA
Chris Crapo and David Nelson, Boston Scientific Neuromodulation
Test automation is difficult to get right. Working under FDA regulation
presents its own challenges. Combining the two is a scary proposition
because the FDA requires—and will scrutinize—the validation of
any test automation used. Despite this, working in a regulated
environment only magnifies the value of test automation. Aware that
automation is a driver of quality and consistency, the FDA welcomes
automated tests as part of an audit submission. The key to success
is demonstrating quality in a way that the FDA recognizes. Chris
Crapo and David Nelson lay out the road map to validation of a test
automation system and highlight the critical thinking, planning, and
types of maintenance that form the core of any successful validation
strategy. By understanding the focal points of validation, you can
set your project up for regulatory success while maintaining a lean,
focused execution that drives results, not paperwork.
T5
CONTINUOUS DELIVERY
Using DevOps to Improve Software Quality in the Cloud
Jeff Payne, Coveros, Inc.
DevOps is gaining popularity as a way to quickly and successfully
deploy new software. With all the emphasis on deployment, software
quality can sometimes be overlooked. In order to understand how
DevOps and software testing mesh, Jeff Payne demonstrates a
fully implemented continuous integration/continuous delivery (CI/
CD) stack. After describing the internals of how CI/CD works, Jeff
identifies the touch points in the stack that are important for testing
organizations. With the now accelerated ability to deliver software,
the testing groups need to know how this technology works and what
to do with it because swarms of manual testers will not be able to
keep up. Jeff demonstrates where and how to use automated testing,
how to collect and make sense of the massive amount of test results
that can be generated from CI/CD, and how to usefully apply manual
testing.
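The fail-fast behavior at the heart of a CI/CD stack can be sketched in a few lines. Stage names and the pass-rate threshold below are illustrative assumptions, not tied to any particular tool:

```python
# Minimal sketch of a CI/CD pipeline with a fail-fast quality gate.

def run_pipeline(stages):
    """Run stages in order; stop at the first failure (fail fast)."""
    results = []
    for name, stage in stages:
        ok = stage()
        results.append((name, ok))
        if not ok:
            break  # do not deploy on a red build
    return results

def make_test_gate(passed, total, min_pass_rate=0.95):
    # Quality gate: enough automated tests must pass before promotion.
    return lambda: total > 0 and passed / total >= min_pass_rate

pipeline = [
    ("build", lambda: True),
    ("unit-tests", make_test_gate(passed=98, total=100)),
    ("deploy-staging", lambda: True),
]
results = run_pipeline(pipeline)
```

In a real stack the stages are build servers and deployment jobs rather than lambdas, but the touch point for testers is the same: the gate that decides whether the pipeline proceeds.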
Testers, Use Metrics Wisely or Don’t Use
Them at All
TEST AUTOMATION
A Path through the Jungle: Validating a
Test Automation System for the FDA
CALL
CONTINUOUS DELIVERY
Using DevOps to Improve Software Quality
in the Cloud
TEST TECHNIQUES
Release the Monkeys: Testing Using the
Netflix Simian Army
T3
T5
TO
For thousands of years, human language has provided us with
beautiful and complex ways of sharing important ideas. At the same
time, language can derail attempts to communicate even the most
basic pieces of critical information. We testers are the heralds of vast
amounts of data, and it is our responsibility to use that data wisely—
or not at all. Whether you are the information presenter whose voice is not being heard or the information receiver who needs ways to spot
errors in the message, a review of how metrics can be skewed—
through ignorance, bias, or malice—provides us with the ability to
think beyond content to the ethics of presentation. Using scientific
research, case studies, and an interactive “try it yourself” experience,
Deborah Kennedy explores both sides of metrics—the good and the
bad. Take away key insights to present your message without built-in
barriers and arm yourself against disreputable attempts to sway you
with unwisely presented data.
CALL 888.268.8770 OR 904.278.0524 TO REGISTER • STARWEST.TECHWELL.COM
CONCURRENT SESSIONS
THURSDAY, OCTOBER 16, 11:15am
T10
MOBILE TESTING
Bridging the Gap in Mobile App Quality
Costa Avradopoulos, Capgemini Consulting
Today, an alarming 65 percent of mobile apps—more than
1.3 million—have a 1-star rating or less. Why? The majority of
development organizations have neither the right processes nor
access to the devices required to properly test mobile applications.
If not addressed, these deficiencies will have a major impact on the
quality of the apps the organization develops. In addition, users
are intolerant of problems and quick to switch to competing apps.
Costa Avradopoulos explores how to address the unique challenges
of mobile testing, starting with adopting the right test strategy.
Costa describes the top challenge test leaders face today—how to
design a proper test lab, given thousands of unique mobile devices.
Costa shares insight into choosing the right devices to optimize
test coverage and reduce risks. He also shows you how to leverage
existing tools and evaluate automation options to keep your team
current with the faster pace of mobility.
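Choosing devices to optimize coverage is, at heart, a set-cover problem. A greedy sketch (device names and traits below are invented for illustration) picks a small lab that still covers the platform traits you care about:

```python
def pick_devices(devices, required_traits):
    """Greedy set cover: repeatedly pick the device covering the most
    still-uncovered traits."""
    uncovered = set(required_traits)
    chosen = []
    while uncovered:
        best = max(devices, key=lambda d: len(devices[d] & uncovered))
        gain = devices[best] & uncovered
        if not gain:
            break  # remaining traits cannot be covered by any device
        chosen.append(best)
        uncovered -= gain
    return chosen, uncovered

devices = {
    "Galaxy S4":  {"android-4", "amoled", "mid-dpi"},
    "Nexus 5":    {"android-4", "lcd", "high-dpi"},
    "iPhone 5s":  {"ios-7", "lcd", "high-dpi"},
    "iPad Mini":  {"ios-7", "tablet", "mid-dpi"},
}
traits = {"android-4", "ios-7", "tablet", "high-dpi", "mid-dpi"}
chosen, missed = pick_devices(devices, traits)
```

Greedy selection is not guaranteed optimal, but it makes the coverage-versus-lab-size trade-off explicit instead of leaving device choice to habit.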
T7
TEST MANAGEMENT
Leading Internationally-Distributed Test Teams
Dennis Pikora, Symantec
Are you employing your offshore test team to its best advantage—
gaining the cost savings and test coverage you expected? Unless
correct management methodologies are in place, you will lose rather
than gain both time and money with internationally-distributed
testers. If you are thinking you can go offshore with minimal effort,
think again. Distributed test leadership and management issues
apply when working with third-party firms, a subsidiary, or even your
own employees. Don’t let unrealistic expectations impact your career
or your company’s goals. Learn methodologies such as site mirroring,
managing Scrum of Scrum meetings, and the value of physical
presence. Become aware of labor laws and cultural differences.
Ensure the best selection of employees at offshore sites. If you
want to successfully manage your distributed international teams
and avoid the pitfalls that plague many firms, join Dennis Pikora as
he discusses the methodologies that enabled the efficiency of his
worldwide teams.
T8
TEST TECHNIQUES
A Feedback-Driven Framework for Testing
Erik Petersen, emprove
How do you do exploratory testing? How do you discover a defect
and explore its extent? How do you manually test a feature and its
variations, or an entire release? Can you describe what you do? Erik
Petersen shares an approach that has worked for him and many
others―a feedback-based framework. Come see it in use, and then
take it away to use on your projects as an action guide, a training
guide, and a way to focus your testing skills. Take an unexpected
journey with Erik as he shares a novel approach to thinking about
testing―collating learning, prioritizing tasks, investigating areas of
interest, experimenting within them, and then cycling through each
stage based on the ongoing discoveries along the way. Using real life
examples from all corners of the (middle) earth, discover how it works
with exploratory testing, defect investigations, and test scoping.
T11
CONTINUOUS DELIVERY
Checking Performance along Your Build Pipeline
Andreas Grabner, Compuware
Do you consider the performance impact when adding a new
JavaScript file, a single AJAX call, or a new database query to your
app? Negligible, you say? I disagree—and so should you. Andreas
Grabner demonstrates the severe impact small changes can have
on performance and scalability. Many small changes will have an
even bigger impact so it is important to catch them early. If you are
working with a delivery pipeline, make sure to look into performance,
scalability, and architectural metrics such as the number of resources
on your page, size of resources, number of requests hitting your web
servers, database statements executed, and log messages created.
Monitoring these allows you to add a new quality gate to your
delivery pipeline and prevents major problems. Andi shares a handful
of metrics to teach to your developers, testers, and operations folks,
and explains why they are important to performance.
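A performance quality gate of this kind can be sketched as a baseline comparison per build. The metric names and the 20 percent tolerance are illustrative assumptions:

```python
# Compare a build's architectural metrics against the previous build
# and flag anything that regressed beyond a tolerance.

def check_build(baseline, current, tolerance=0.20):
    """Return the metrics that grew more than `tolerance` vs the baseline."""
    regressions = {}
    for metric, old in baseline.items():
        new = current.get(metric, old)
        if old > 0 and (new - old) / old > tolerance:
            regressions[metric] = (old, new)
    return regressions

baseline = {"page_resources": 40, "sql_statements": 12, "server_requests": 25}
current  = {"page_resources": 42, "sql_statements": 30, "server_requests": 26}
regressions = check_build(baseline, current)
```

Wired into the pipeline, a non-empty `regressions` result fails the build, which is exactly how a "small" change that triples the database statement count gets caught early.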
T12
METRICS
Metrics That Matter
Pablo Garcia, Redmind
Imagine you’re a test manager starting a new assignment. On the first
day of work, you’re presented with a list of metrics you are to report.
Soon, you realize that most of the metrics are not really connected
to what should be measured. Or, consider the situation where you’re
told that there is no value collecting metrics because “we’re agile.”
In either situation, what would your next step be? Join Pablo
Garcia as he shares his experience with the dangers of poor metrics.
Believing that some metrics can have value in helping testing be
effective and efficient, Pablo shares his favorite metrics including
a couple of crazy ones—requirements coverage, defect detection
percentage, faults in production, and cost per bug. Each is discussed,
evaluating what it really measures, when to use it, and how to present
it to send the correct message. Take back a toolbox of testing metrics
that will make your testing role easier.
T9
TEST AUTOMATION
Automation Abstractions: Page Objects and Beyond
Alan Richardson, Compendium Developments
When you start writing automation for your projects, you quickly
realize that you need to organize and design the code. You will
write far more than “test” code; you also will write abstraction code
because you want to make tests easier to read and maintain. But
how do you design all this code? How do you organize and structure
it? Should you use a domain specific language? Should you go
keyword driven or use Gherkin? Should you use page objects with
POJO or Factories? Do you create DOM level abstractions? Where
do domain models fit in? Alan Richardson provides an overview of
options available to you when modeling abstraction layers. Based
on his experience with many approaches on real-world commercial
projects, Alan helps you understand how to think about the modeling
of abstraction layers. Illustrated with a number of code examples, Alan
shows you a variety of approaches and discusses the pros and cons
associated with each.
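The page object idea itself fits in a short sketch. Here `FakeDriver` stands in for a real WebDriver, and the locators and pages are invented for illustration:

```python
class FakeDriver:
    """Stand-in for a real browser driver, so the sketch is self-contained."""
    def __init__(self):
        self.fields = {}
        self.url = "/login"

    def type(self, locator, text):
        self.fields[locator] = text

    def click(self, locator):
        if locator == "login-button" and self.fields.get("password") == "secret":
            self.url = "/home"

class LoginPage:
    """Page object: tests talk to intent-level methods, not locators."""
    def __init__(self, driver):
        self.driver = driver

    def login_as(self, user, password):
        self.driver.type("username", user)
        self.driver.type("password", password)
        self.driver.click("login-button")
        return HomePage(self.driver)

class HomePage:
    def __init__(self, driver):
        self.driver = driver

    def is_displayed(self):
        return self.driver.url == "/home"

# The test reads as intent; locator changes stay inside the page object.
driver = FakeDriver()
home = LoginPage(driver).login_as("alice", "secret")
```

Whether you layer Gherkin, keywords, or domain models on top, this separation of intent from mechanics is the common core of the abstraction options discussed.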
CONCURRENT SESSIONS
THURSDAY, OCTOBER 16, 1:30pm
T13
TEST MANAGEMENT
The Unfortunate Triumph of Process over Purpose
James Christie, Claro Testing
As a test manager, James Christie experienced two divergent views
of a single project. The official version claimed that planning and
documentation were excellent, with problems discovered during test
execution being managed effectively. In fact, the project had no useful
plans, so testers improvised test execution. Creating standardized
documentation took priority over preparing for the specific
problems testers would actually face during testing. The required
documentation standards didn’t assist testing; they actually hindered
by distracting from relevant, detailed preparation. It was a triumph of
process over purpose. James shows that this is a problem that testing
shares with other complex disciplines. Devotion to processes and
standards inhibits creativity and innovation. They provide a comfort
blanket and a smokescreen of “professionalism” where following the
ritual becomes more important than accomplishing the goals. Unless
we address this issue, organizations will question whether testers
really add value. Testers must respond by challenging unhelpful
processes and the culture that encourages them. Purpose must come
before process!
T16
MOBILE TESTING
Ensuring the Performance of Mobile Apps—on Every Device and Network
Steve Weisfeldt, Neotys
Applications today are accessed over myriad network configurations—
wired, wireless, and mobile networks. Deployed applications may
deliver different content and functionality depending on whether
the user is accessing it via a browser, smartphone, or tablet. Steve
Weisfeldt explains how these combinations significantly impact the
performance of applications, creating a previously unseen set of
testing challenges. A crucial part of the load testing process is being
able to emulate network constraints, change connection speeds,
and control parameters such as packet loss and network latency to
test in the most realistic scenarios. Learn the ramifications of these
new technologies and constraints on testing for mobile application
performance. Join Steve in discussing approaches and considerations
to ensure that high application performance is delivered to all end users—all the time—regardless of device or network.
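A back-of-the-envelope model shows why these constraints matter. The formula below is a deliberate simplification (one round trip plus transfer time, with loss modeled as naive retransmission) using invented numbers, not any vendor's emulation logic:

```python
def response_time_ms(payload_kb, bandwidth_kbps, latency_ms, loss_rate=0.0):
    """Crude model: latency plus transfer time, inflated by packet loss."""
    transfer_ms = payload_kb * 8 / bandwidth_kbps * 1000
    effective = transfer_ms / (1 - loss_rate)  # lost packets force resends
    return latency_ms + effective

# The same 200 KB page over fast wi-fi vs. a lossy cellular link:
wifi = response_time_ms(200, bandwidth_kbps=20000, latency_ms=30)
cell = response_time_ms(200, bandwidth_kbps=1000, latency_ms=300, loss_rate=0.02)
```

Even this toy model makes the point: an app that feels instant on the office network can be an order of magnitude slower on the connection your users actually have, which is why load tests must emulate those conditions.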
T14
TEST TECHNIQUES
Speed Up Testing with Monitoring Tools
Jim Hirschauer, AppDynamics
The software development lifecycle is a pretty complex process
in many organizations. However, by using monitoring tools and
methodologies, you can accelerate testing and release higher quality
code—the cornerstone of rapid software delivery. These tools provide
immediate feedback with actionable information so you can address
problems as they are detected instead of waiting until the end of a
testing cycle. Earlier detection, combined with tests that are a better
representation of production workloads, are key to releasing better
code, faster. Jim Hirschauer shows how to use monitoring software to
make a major impact during development, test, and production. He
describes typical use cases for server monitoring, log monitoring, and
application performance monitoring. Learn about open source testing
tools including Siege, Multi-Mechanize, and Bees with Machine Guns.
Understand how to use each of these tools and more in development,
test, and production as well as creating a feedback loop that drives
continuous improvement.
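The log-monitoring part of that feedback loop can be sketched simply: group error lines into time windows and flag windows that exceed a threshold, so problems surface during the run. The log format and threshold are assumptions for illustration:

```python
from collections import Counter

def error_windows(log_lines, window_seconds=60, threshold=3):
    """Count ERROR lines per time window; return windows at or over threshold.
    Each line is assumed to look like: '<epoch-seconds> <LEVEL> <message>'."""
    counts = Counter()
    for line in log_lines:
        ts, level, _ = line.split(" ", 2)
        if level == "ERROR":
            counts[int(ts) // window_seconds] += 1
    return {w: n for w, n in counts.items() if n >= threshold}

logs = [
    "100 INFO request ok",
    "110 ERROR timeout talking to db",
    "115 ERROR timeout talking to db",
    "118 ERROR connection reset",
    "200 ERROR single blip",
]
alerts = error_windows(logs)
```

A burst of errors inside one window is actionable during the test cycle; the same errors discovered in a post-run report are a missed opportunity.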
T15
TEST AUTOMATION
Making Your Test Automation Transparent
Subodh Parulekar, AFour Technologies, Inc.
Business analysts, developers, and testers are sometimes not on
the same page when it comes to test automation. When there is no
transparency in test cases, execution, coverage, and data, review
of automation by all stakeholders is difficult. Making automation
scripts easily readable and writable allows stakeholders to better
participate. Subodh Parulekar describes how his team dealt with
these issues. Learn how they leveraged behavior-driven development
(BDD) concepts and put a wrapper around their existing automation
framework to make it more user-friendly with the easy to understand
Given-When-Then format. Subodh discusses how his team
implemented the new approach in four months to automate 700+
test cases. Now, test reports contain the actual Gherkin test step that
passed or failed allowing any stakeholder to evaluate the outcome.
Learn how stakeholders can rerun a failed test case from the reporting
dashboard to determine if the failure is related to a synchronization,
environmental, functional, or test data problem.
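A Given-When-Then wrapper of this kind can be sketched as a small step registry that maps plain-text steps onto existing automation functions. The step phrases and the regex registry here are invented, not Subodh's actual framework:

```python
import re

STEPS = []

def step(pattern):
    """Register an automation function under a readable step phrase."""
    def register(fn):
        STEPS.append((re.compile(pattern), fn))
        return fn
    return register

@step(r"Given a cart with (\d+) items")
def given_cart(ctx, n):
    ctx["items"] = int(n)

@step(r"When the user adds (\d+) more")
def when_add(ctx, n):
    ctx["items"] += int(n)

@step(r"Then the cart holds (\d+) items")
def then_check(ctx, n):
    assert ctx["items"] == int(n), f"expected {n}, got {ctx['items']}"

def run_scenario(lines):
    """Execute a plain-text scenario any stakeholder can read."""
    ctx = {}
    for line in lines:
        for pattern, fn in STEPS:
            m = pattern.fullmatch(line)
            if m:
                fn(ctx, *m.groups())
                break
        else:
            raise ValueError(f"no step matches: {line}")
    return ctx

ctx = run_scenario([
    "Given a cart with 2 items",
    "When the user adds 3 more",
    "Then the cart holds 5 items",
])
```

The scenario text doubles as the report line, which is what lets non-programmers read, review, and even rerun failed cases.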
T17
PERFORMANCE TESTING
Build Your Custom Performance Testing Framework
Prashant Suri, Rackspace
Performance testing requires knowledge of systems architecture,
techniques to simulate the load equivalent of sometimes millions of
transactions per day, and tools to monitor/report runtime statistics.
With the evolution from desktop to web and now the cloud,
performance testing involves an unparalleled combination of different
workloads and technologies. There is no one tool available—either
commercial or open source—that meets all performance testing
needs. Some tools act as load generators; others only monitor
system resources; and many only operate for specific applications
or environments. Prashant Suri shares the essential components you
need for a comprehensive performance test framework and explores
why each component is required for a holistic test. Learn how to
develop your custom framework—starting with parsing test scripts in
a predefined format, iterating over test data, employing distributed
load generators, and integrating test monitors into the framework.
Discover how building your own framework gives you flexibility to
challenge multiple performance problems—and save thousands of
dollars along the way.
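The framework components listed above can be sketched as a skeleton: iterate test data, drive concurrent workers, collect per-request latencies. The simulated `send_request` stands in for real transport:

```python
import threading
import time

def send_request(payload):
    time.sleep(0.001)           # stand-in for a network round trip
    return len(str(payload))    # stand-in for a response

def worker(data, latencies, lock):
    for payload in data:
        start = time.perf_counter()
        send_request(payload)
        elapsed = time.perf_counter() - start
        with lock:
            latencies.append(elapsed)

def run_load(test_data, n_workers=4):
    """Split test data across workers and gather per-request latencies."""
    latencies, lock, threads = [], threading.Lock(), []
    for i in range(n_workers):
        t = threading.Thread(target=worker,
                             args=(test_data[i::n_workers], latencies, lock))
        threads.append(t)
        t.start()
    for t in threads:
        t.join()
    return latencies

lat = run_load(list(range(20)))
```

A production framework distributes the workers across machines and plugs monitors into the same loop, but the skeleton of script parsing, data iteration, load generation, and measurement is the one shown here.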
T18
SECURITY
Testing Application Security: The Hacker Psyche Exposed
Mike Benkovich, Imagine Technologies, Inc.
Computer hacking isn’t a new thing, but the threat is real and
growing even today. It is always the attacker’s advantage and the
defender’s dilemma. How do you keep your secrets safe and your
data protected? In today’s ever-changing technology landscape,
the fundamentals of producing secure code and systems are more
important than ever. Exploring the psyche of hackers, Mike Benkovich
exposes how they think, reveals common areas where they find
weakness, and identifies novel ways to test your defenses against their
threats. From injection attacks and cross-site scripting to security misconfiguration and broken session management, Mike examines the top
exploits, shows you how they work, explores ways to test for them,
and then shares what you can do to help your team build more secure
software in the future. Join Mike and help your company avoid being
at the center of the next media frenzy over lost or compromised data.
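One of the simplest tests in this family checks for reflected input, a precursor to cross-site scripting. The page-rendering function below is a deliberately vulnerable stand-in for illustration, not a real application:

```python
XSS_PROBES = ['<script>alert(1)</script>', '"onmouseover="alert(1)']

def render_search_page(query):
    # Vulnerable stand-in: echoes user input into HTML without escaping.
    return f"<h1>Results for: {query}</h1>"

def reflected_payloads(render, probes):
    """Return the probes that come back unescaped in the rendered HTML."""
    return [p for p in probes if p in render(p)]

findings = reflected_payloads(render_search_page, XSS_PROBES)
```

A non-empty `findings` list means user input reaches the page without encoding, which is exactly the weakness an attacker looks for first.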
CONCURRENT SESSIONS
THURSDAY, OCTOBER 16, 3:00pm
T22
MOBILE TESTING
Five Ways to Improve Your Mobile Testing
Dennis Schultz, IBM
Few technology shifts have impacted the way we do business as
much as mobile. The new and exciting functionality delivered by
mobile apps, the pace at which they are being developed, and their
emergence as the “face of the business” requires that organizations
deliver unprecedented quality in these software systems. Join Dennis
Schultz to learn how leading enterprises are approaching their mobile
application testing challenges and how they have integrated mobile
into their existing processes. Dennis describes the importance of
testing on real devices, the value of using emulators to supplement
your testing strategy, how to optimize your testing with real devices
using SaaS remote device services, how to automate your repetitive
tests to speed time to market and improve quality, and how to
support a collaborative work environment and efficient test process
for mobile development.
T19
TEST MANAGEMENT
Before You Test Your System, Test Your Assumptions
Aaron Sanders, Agile Coach
Do you find yourself discussing with your peers what you think the
system you’re building should do? Do you argue over what the
users want? Do discussions wind up in a heated debate? This result
indicates that no shared understanding exists about the system. With
a lack of shared understanding, it’s easy to fall into the trap of making
assumptions about system functionality, who the users will be, and
how to build the system. These assumptions introduce errors into the
requirements and design—long before a single line of code is written.
Creating a shared understanding among stakeholders, users, and
teams reduces the chances of not building the right thing—as well as
not building the thing right. Aaron Sanders describes the techniques
of experimental design, story mapping, user research, prototyping,
and user acceptance testing that he’s used to help teams build a
shared understanding. Learn to test your assumptions as rigorously as
you test the system itself.
T20
TEST TECHNIQUES
User Acceptance Testing in the Testing
Center of Excellence
Deepika Mamnani, Hexaware Technologies
Centralization of testing services into a testing center of excellence
(TCoE) for system testing is common in IT shops today. To make this
transformation mature, the next logical step is to incorporate the user
acceptance testing (UAT) function into the TCoE. This poses unique
challenges for the TCoE and mandates the testing team develop a
combination of business process knowledge coupled with technology
and test process expertise. Deepika Mamnani shares her experiences
in implementing a UAT TCoE and best practices—from inception to
planning to execution. Learn techniques to create business-oriented
testable requirements, strategies to size and structure the team, and
the role of automation. Review testing metrics needed to measure the
success of the UAT function. Hear a real-world transformation journey
and the quantitative business benefits achieved by an organization
incorporating UAT as a centralized function within the TCoE. Take
back strategies to incorporate UAT as a part of your TCoE.
T21
TEST AUTOMATION
The Doctor Is In: Diagnosing Test Automation Diseases
Seretta Gamba, Steria Mummert ISS GmbH
When doing test automation, you sometimes notice that things are
not working out as expected, but it’s not clear why. You are so caught
up in the day-to-day work that you don’t see the bigger picture.
It’s like when you get sick—you know something is wrong, but you
don’t know what. That’s the time to visit a doctor. Doctors diagnose
diseases mainly by asking questions. First, they get a general
idea of what’s wrong; then the questions become more and more
specific; and in the end, they identify the disease and prescribe the
appropriate cure. This method also works well for test automation. By
first asking general questions, and then more and more specific ones,
you can identify the disease (the issue) and then it’s relatively simple
to select the most appropriate remedy. Seretta Gamba demonstrates
this method with examples of common automation diseases and
suggests the appropriate patterns to cure them.
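The general-to-specific questioning Seretta describes is naturally a decision tree. The questions, answers, and remedies below are invented examples of the pattern, not her actual catalog:

```python
DIAGNOSIS_TREE = {
    "Do tests fail intermittently?": {
        "yes": {
            "Do failures involve timing or waits?": {
                "yes": "Disease: flaky synchronization. Remedy: explicit waits.",
                "no": "Disease: shared test data. Remedy: isolate fixtures.",
            }
        },
        "no": {
            "Is maintenance effort growing per release?": {
                "yes": "Disease: locator sprawl. Remedy: add an abstraction layer.",
                "no": "No obvious disease found.",
            }
        },
    }
}

def diagnose(tree, answers):
    """Walk the question tree using the given answers; return the remedy."""
    node = tree
    while isinstance(node, dict):
        question = next(iter(node))  # one question per node in this sketch
        node = node[question][answers[question]]
    return node

remedy = diagnose(DIAGNOSIS_TREE,
                  {"Do tests fail intermittently?": "yes",
                   "Do failures involve timing or waits?": "yes"})
```

Encoding the questions this way turns tribal debugging knowledge into something a whole team can apply consistently.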
T23
PERFORMANCE TESTING
Modeling System Performance with
Production Data
William Hurley, Astadia
When creating system performance models, the primary challenges
are where and how to start. Whatever the performance characteristics
being estimated or modeled, we need a solid approach that
addresses both business and system needs. All too often performance
tests inadvertently mix load and stress scenarios with little regard for
how this will confound recommendations and business decisions.
If you are a test manager, a business process owner, or you simply
want to better understand performance testing, you will be
interested in William Hurley's case studies. Will presents two real-world examples that demonstrate the impact on business decisions,
and show how to use production data and statistical modeling to
improve both the analysis and business decisions. The first study is
a cloud-based performance testing engagement. The second is a
back-end re-hosting study where acceptance criteria were based on achieving “equal or faster” performance. Take away new insights
and approaches to improve performance-based decisions for your
organization.
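The core move in this kind of modeling is to derive targets from production percentiles rather than means. A minimal sketch using the standard library (the sample data is invented):

```python
import statistics

# Production response-time samples in milliseconds, including one outlier.
production_ms = [120, 130, 125, 140, 135, 600, 128, 132, 138, 145]

def percentile(samples, p):
    """p-th percentile via statistics.quantiles (inclusive method)."""
    return statistics.quantiles(sorted(samples), n=100, method="inclusive")[p - 1]

p50 = percentile(production_ms, 50)
p95 = percentile(production_ms, 95)
mean = statistics.mean(production_ms)
# A mean-based target hides the outlier; a p95 target does not, so the
# load-test acceptance criterion should be stated against the percentile.
```

Separating load scenarios (percentile targets under expected traffic) from stress scenarios (finding the breaking point) keeps the resulting numbers meaningful to business decisions, which is the confusion the session warns against.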
T24
SECURITY
Testing API Security: A Wizard's Guide
Ole Lensmar, SmartBear Software
As we’ve seen in recurring events in the past year, web services APIs
are a primary target for security attacks—and the consequences
can be catastrophic for both API providers and end users. Stolen
passwords, leaked credit card numbers, and revealed private
messages and photos are just some of the headaches awaiting those
who have been compromised. Ole Lensmar puts on his hacker-cloak
to show how attackers break systems via web service APIs with
fuzzing, session spoofing, injection attacks, cross-site scripting, and
other methods. Learn how these attacks actually work on an API and
how we can test an API to make sure it isn’t vulnerable—without
compromising the API at the same time. Find out the roles various
security-related standards play and how they affect testing. Come and
find out. You can’t afford not to.
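Fuzzing an API parameter, in the spirit of the attacks above, can be sketched as a mutation generator. The base request and mutation list are illustrative; a real harness would send each case and watch for 5xx responses, leaked stack traces, or broken sessions:

```python
def fuzz_cases(base_request, field):
    """Yield copies of the request with hostile values in one field."""
    mutations = [
        "",                           # empty value
        "A" * 10_000,                 # oversized input
        "' OR '1'='1",                # SQL injection probe
        "<script>alert(1)</script>",  # script injection probe
        None,                         # wrong type: null
        -1,                           # boundary number
    ]
    for value in mutations:
        case = dict(base_request)
        case[field] = value
        yield case

base = {"user": "alice", "session": "abc123"}
cases = list(fuzz_cases(base, "session"))
```

Because each case differs from a known-good request in exactly one field, any failure it triggers points directly at the unvalidated input.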
BONUS SESSIONS
Speaking 101: Tips and Tricks
Lee Copeland, Software Quality Engineering
Tuesday, October 14 • 6:30pm–7:30pm
Back by Popular Demand!
Are you a new STAR speaker or
aspiring to be one in the future? Join
us at this workshop on making effective
conference presentations. Learn
the secrets of developing content,
identifying the Big Message, preparing
slides with just the right words and
images, presenting your message,
handling questions from the audience, and being ready
when things go wrong. Lee Copeland, a professional
speaker since birth, shares ideas that will help you be a
better speaker, no matter what the occasion.
The Workshop on Regulated Software Testing (WREST)
John McConda, Moser Consulting, and Griffin
Jones, Congruent Compliance, LLC
Friday, October 17 • 8:30am–4:30pm
Join us at The Workshop
on Regulated Software
Testing (WREST)—a free,
full-day bonus session
held on Friday after the
conference concludes. A
unique peer workshop,
WREST is dedicated to
improving the practice of testing regulated systems. We
define regulated software as any system that is subject to an
internal or external review.
WREST relies on its attendees to make the workshop a success.
There are no formal presentations, only experience reports with
plenty of time designated for facilitated discussion. We hope to
learn from each other by hearing the success and (especially!)
failure stories of real practitioners who test regulated software.
Have a problem you want input on solving? You can bring
that to the workshop as well—just be prepared to participate!
WREST is hosted by John McConda and Griffin Jones.
Service Virtualization and Continuous Integration
Sponsored by Cognizant
Wednesday, October 15, 7:15am-8:15am
Join this unique breakfast bonus session featuring
speakers Kannan Subramaniam, VP, TQM, Comcast,
and Premkumar Balasubramanian, Sr. Director, QE&A,
Cognizant Technologies, for an in-depth discussion on
service virtualization and continuous integration.
Limited seats available. Reserve your seat by contacting the Client
Support Group at 888.268.8770 or 904.278.0524 or [email protected].
CONFERENCE BONUS!
Digital Subscription to Better Software Magazine!
STARWEST conference attendees receive a digital subscription to
Better Software magazine. Delivering relevant, timely information,
Better Software magazine helps you tackle the challenges of
building high-quality software, regardless of your role in the software
development lifecycle. www.BetterSoftware.com
Testing & Quality Leadership Summit
Thursday, Oct. 16 (evening), and Friday, Oct. 17 (all day)
Test Leadership on the Edge
The role of a software tester continues to change as new software processes and
new technologies drive the need for better, more effective testing approaches. Join
in the conversation with your peers as experienced testing and quality leaders share
ways to lead an organization living on the technical edge. Discover how seasoned
leaders deal with changes in platform technology, development/testing tools, and agile
development methods.
At the 2014 Testing & Quality Leadership Summit, summit chair Jeffery Payne brings together senior industry leaders—Jon Bach,
Rob Sabourin, and others—for an interactive exchange of ideas and experiences. You also won’t want to miss the Test Leader
Rumble—a high energy panel discussion and debate between these cutting-edge test leaders.
Also, bring your biggest issues and challenges to the Testing & Quality Leadership Summit, where you can draw on the
knowledge and experiences of these leaders and your fellow managers who may have already faced and solved some of your
issues. You’ll hear what’s working—and not working—and have the opportunity to share your experiences and successes. The
Testing & Quality Leadership Summit is a perfect opportunity for you to:
• Participate in insightful and informative sessions focusing on leadership issues
• Meet and network with your peers in the industry
• Join in the “think tank” discussion with industry veterans
• Develop new ideas and action plans for innovation within your organization
Jeff Payne, Coveros, Inc., Summit Chair
THURSDAY, OCTOBER 16
5:30  Reception—Think Tank Issues Identification: As a Leader, What Is Keeping You Up at Night? (Jeff Payne, Coveros, Inc.)
FRIDAY, OCTOBER 17
8:00  Registration and Breakfast
8:30  Finding Your Edge on the Edge (Jon Bach, eBay, Inc.)
9:30  Networking Break
9:45  Test Leader Rumble—A Panel Discussion/Debate (Jon Bach, Rob Sabourin, and other test leaders!)
10:45 Networking Break
11:00 Think Tank Discussion: Leadership Solution Brainstorm (Part 1)
12:30 Networking Lunch Buffet
1:30  Think Tank Discussion: Presentation of Results (Part 2)
2:30  Wrap-up and Ongoing Informal Discussions with Speakers and Attendees
Testing & Quality Leadership Summit Sessions
FRIDAY, OCTOBER 17
8:30am
Finding Your Edge on the Edge
With more than eighteen years of
software testing experience, Jon Bach
has held technical and managerial
positions in companies including Hewlett-Packard and Microsoft. As director of Live
Site Quality for eBay, Jon is dedicated to building
“end-to-end” tests (activity flows) in eBay’s core sites
to discover important bugs that threaten its core
business. He and his brother James created Session-Based Test Management, a method to manage and
report exploratory testing. Jon frequently speaks at
the STAR conferences and usually can be found
wearing a ball cap, hanging out in the conference
hallways, encouraging others, and sharing best testing
ideas and patterns.
Jon Bach, eBay, Inc.
A good tester can find problems or risks under any condition—compressed time,
no cooperation, or sparse information. The best ones seem to thrive despite
uncertainty and pressure. Why is that? What is their secret? In this interactive talk,
Jon Bach will point out the attributes he’s seen in testers for 20 years (and that are
likely within you, too). What skills can you cultivate to be a more effective leader
despite being under stress? His premise is: “We’re all tested” and even though
testing is about revealing weaknesses—the good news is that it also reveals
strengths. How can you be an effective leader in the face of changing technology,
organizations, and process? Share your advice for how you have coped and even
thrived in a constantly changing technological and company landscape.
9:45am
Test Leader Rumble—
A Panel Discussion/Debate
Rob Sabourin, P. Eng., has more than
thirty years of management experience
leading teams of software development
professionals. A well-respected member of
the software engineering community, Rob
has managed, trained, mentored, and coached
hundreds of top professionals in the field. He
frequently speaks at conferences and writes on
software engineering, SQA, testing, management, and
internationalization. Rob wrote I am a Bug!, the
popular software testing children’s book; works as an
adjunct professor of software engineering at McGill
University; and serves as the principal consultant (and president/janitor) of AmiBug.Com, Inc. Contact Rob at [email protected].
Jon Bach, Rob Sabourin, and other test leaders!
Every leader attacks a problem in a different way. Join three distinguished test
leaders as they discuss and debate how to tackle today’s stickiest test
leadership issues:
• How to motivate testers
• How to get what you need from upper level management
• Dealing effectively with the software development organization
• Addressing morale and performance issues
• And more!
Learn how these problems can be attacked in different ways. Take home pragmatic,
proven techniques for addressing test leadership challenges.
11:00am
Think Tank Discussion: Leadership Solution Brainstorm
(part 1)
Jeff Payne is CEO and founder of
Coveros, Inc., a software company that
builds secure software applications using
agile methods. Since its inception in
2008, Coveros has become a market
leader in secure agile principles while being
recognized by Inc. Magazine as one of the fastest
growing private companies in the country. Prior to
founding Coveros, Jeffery was Chairman of the Board,
CEO, and co-founder of Cigital, Inc., a market leader
in software security consulting. Jeff has published over
30 papers on software development and testing as
well as testified before Congress on issues of national
importance, including intellectual property rights,
cyber-terrorism, and software quality.
Jeff Payne, CEO and founder, Coveros, Inc.
Join your peers in an engaging and highly interactive session to discuss the
issues that affect you most. Using answers to the question “As a Leader, What Is
Keeping You Up at Night?” posed at Thursday’s evening reception, participants will
form small groups to work on finding solutions to pressing test management issues.
Discussions will review identified issues and barriers to change, and focus on innovative
strategies and practical next steps. At the end of the think tank, all feedback will be
collected and posted online to encourage further collaboration.
1:30pm
Think Tank Discussion: Presentation of Results (part 2)
In the morning think tank discussion you discovered solutions to some of your
most challenging issues. Now each group will present their findings, share their
solutions, and learn from each other. At the end of the think tank, all feedback will
be collected and posted online to encourage further collaboration.
CALL 888.268.8770 OR 904.278.0524 TO REGISTER • STARWEST.TECHWELL.COM
VISIT the EXPO
Wednesday, October 15–Thursday, October 16
Discover the Top Technologies and Tools All Under One Roof!
Visit the STARWEST Expo and enjoy all of these unique opportunities:
• See the latest solutions in testing technologies, software, and tools
• Meet one-on-one with representatives from some of today’s most innovative organizations
• Network with colleagues and conference speakers while enjoying cocktails and appetizers during the Expo Reception
• Learn new skills and solutions, and participate in live demos during the industry technical presentations
• Travel the Expo floor for fun games and a chance to win exciting prizes
• Enjoy various session breaks in the Expo with complimentary refreshments to keep you energized!
Unable to join us for the entire week? Request your free 1-day
Expo pass at http://vlt.me/expopass
EXPO HOURS
Wednesday, Oct. 15: 10:30am–2:00pm and 3:30pm–6:30pm
Thursday, Oct. 16: 10:30am–3:00pm
Expo Reception
Wednesday, 5:30pm–6:30pm
All attendees are invited to the Expo Reception for complimentary food and beverages.
EXHIBITORS and CONFERENCE SPONSORS
The sponsors below will all be exhibiting at STARWEST. Please come visit each of their booths to meet
one-on-one with representatives from these innovative organizations!
Premier Sponsor:
Platinum Sponsors:
Gold Sponsors:
Silver Sponsors:
Partners:
SQE TRAINING
For sponsor/exhibitor news and updates, visit starwest.techwell.com.
To become a sponsor/exhibitor, please contact [email protected].
SEARCHING FOR
THE BEST DEAL?
Ways to Save on Your Conference Registration
Your Best Value—The Full Conference Package (5 Full Days), including:
• 2 Days of Pre-conference Tutorials
• 2 Days of Concurrent Sessions
• 1 Full Day of the Testing & Quality Leadership Summit
• 5 Industry-leading Keynotes
• The Expo & Bonus Sessions
• All Networking & Special Events
• All Continental Breakfasts, Lunches, and Refreshment Breaks
• Combine with the other ways to save below for even more value!
THE FORMULA FOR YOUR BEST VALUE: Only $2,795 if you register before September 12
EARLY BIRD OFFER
Receive up to $200 off the regular conference registration fee (depending on the conference package
selected) if payment is received on or before September 12, 2014.
GROUPS OF 3 OR MORE SAVE UP TO 30% OFF
Register a group of three or more at the same time and save up to 30% off each registration. To take
advantage of this offer, please call the Client Support Group at 888.268.8770 or 904.278.0524 or email
them at [email protected] and reference promo code GRP3 (See page 31 for details).
ALUMNI DISCOUNT
STAR alumni receive up to an additional $200 discount off their registration fee (depending on the
conference package selected). If you are a STAR alum and unable to attend STARWEST this year, you
may pass your alumni discount on to a colleague!
MULTI-DAY TRAINING CLASS + CONFERENCE
Save an additional $300 when you attend any of the multi-day training classes and the conference
(discount already reflected in the conference pricing).
Please Note—We will always provide the highest possible discount and allow you to use the two largest discounts that apply to your registration.
30
CALL
888.268.8770
OR
904.278.0524
TO
REGISTER
•
S TA R W E S T. T E C H W E L L . C O M
STARWEST REGISTRATION INFORMATION
OCTOBER 12–17, 2014 ANAHEIM, CA, USA

Easy to Register
ONLINE: starwest.techwell.com
PHONE: 888.268.8770 or 904.278.0524
EMAIL: [email protected]

CONFERENCE PRICING
Registration Fees* (Early Bird on or before September 12 / After September 12):
o Best Value Package (Mon–Fri): $2,795 / $2,995
  Includes 2 days of Pre-conference Tutorials, 2 Conference Days, and the Testing & Quality Leadership Summit
o Conference + 2 Tutorial Days: $2,295 / $2,495
o Conference + 1 Tutorial Day: $1,995 / $2,145
o Conference Only (Wed–Thur): $1,795 / $1,895
o 2 Tutorial Days: $1,745
o 1 Tutorial Day: $945 / $995
o Testing and Quality Leadership Summit: $945 / $995
o Add Testing & Quality Leadership Summit (Friday) to any Conference package: $595 / $595

TRAINING + CONFERENCE PRICING (Early Bird on or before September 12 / After September 12):
o Software Tester Certification—Foundation Level + Conference (includes $250 fee for ISTQB exam): $3,840 / $3,940
o Real-World Software Testing with Microsoft Visual Studio® + Conference: $3,590 / $3,690
o Requirements-Based Testing + Conference: $3,090 / $3,190
o Mastering HP LoadRunner® for Performance + Conference: $3,590 / $3,690
o Agile Tester Certification—ICAgile + 1 Tutorial Day + Conference: $3,340 / $3,490
o Fundamentals of Agile Certification—ICAgile + 1 Tutorial Day + Conference: $3,340 / $3,490
o Mobile Application Testing + 1 Tutorial Day + Conference: $3,340 / $3,490
BRING YOUR TEAM AND SAVE UP TO 30% ON EACH REGISTRATION!
See how much savings groups of 3+ can enjoy on one of our most popular conference packages: Conference + 2 Tutorial Days.

Number of Team Members | Regular Pricing | Early Bird Pricing (by September 12, 2014*) | Group Savings
1-2   | $2,495 | $2,395 | —
3-9   | $1,996 | $1,876 | 20%
10-19 | $1,871 | $1,759 | 25%
20+   | $1,746 | $1,642 | 30%

*Full payment must be received by deadline date.
PAYMENT INFORMATION
The following forms of payment are accepted: Visa, MasterCard, Discover, American Express, check, or U.S. company purchase order. Payment must be received before the registration is confirmed.
Make all checks payable to Software Quality Engineering. You will receive a confirmation email upon payment by check, credit card, or company purchase order. Payment must be received at Software
Quality Engineering on or before September 12, 2014, to take advantage of the Early Bird conference rates listed above.
HOTEL RESERVATIONS
Take advantage of the discounted conference rate at the Disneyland Hotel. To make a reservation, visit http://vlt.me/sw14hotel or call 714.520.5005 and mention you are a STARWEST attendee to receive
your discount. Cancellations on a guaranteed reservation must occur more than five days prior to the specified arrival time to ensure a refund. If you need special facilities or services, please specify at the time
of reservation.
CANCELLATION POLICY
Conference registrations cancelled after September 22, 2014 are subject to a 20% cancellation fee. No cancellations or refunds may be made after September 29, 2014. Substitutions may be made at any time before
the first day of the program. Call the Client Support Group at 904.278.0524 or 888.268.8770 to obtain a cancellation code. All valid cancellations require a cancellation code.
SATISFACTION GUARANTEE
Software Quality Engineering is proud to offer a 100% satisfaction guarantee. If we are unable to satisfy you, we will gladly refund your registration fee in full.
MEDIA RELEASE
From time to time we use photographs, video, and audio of conference participants in our promotional and publishing materials. By virtue of your attendance at the STARWEST conference, you
acknowledge that Software Quality Engineering, Inc. reserves the right to use your likeness in such materials.
*Your registration includes a digital subscription to Better Software magazine.
PRESORTED
STANDARD
U.S. POSTAGE
PAID
GAINESVILLE, FL
PERMIT NO. 726
340 Corporate Way, Suite 300
Orange Park, FL 32073
IF ADDRESSEE IS NO LONGER EMPLOYED:
Re-route to Director of Software Development
Want to Go Green?
Email us at [email protected] with “Green” in the subject line to change your preferences to receive email communications only.
Breaking Software
OCTOBER 12–17, 2014
ANAHEIM, CA
DISNEYLAND HOTEL
starwest.techwell.com
www.sqe.com/stareast
OCTOBER 15–16
THE EXPO
Visit Top Industry Providers
Offering the Latest in Testing
Solutions
TOOLS • SERVICES
TECHNIQUES • DEMOS
Register by September 12
and save up to $200!
98% of 2013 Attendees Recommend STARWEST to Others in the Industry
FRIDAY, OCTOBER 17
Testing & Quality
Leadership Summit
SOFTWARE TESTING ANALYSIS & REVIEW