At the 2014 StarEast Software Testing Conference, Agile prevailed. This year, DevOps (a cooperative effort among developers, QA, and IT operations staff) was talked about more often. Nevertheless, as one of this year's short closing "Lightning Keynotes" put it, producing more and better-quality software at a faster rate is what it's all about.
The role of the tester within an organization was the focus of Keith Klain's opening address. Klain, executive director and head of Software Quality Management for Tekmark Global Solutions, has an enterprise technology background, having been in charge of global software testing at Barclays. In such a large company, he said, the goals of the business managers are distinct from those of the testers, and more attention needs to be paid to the value proposition of software testing. Instead of a "train and pray" culture, Klain suggested, enterprises need a direct link from the business objectives to the supporting technology and then to what employees actually do. In Klain's opinion, testers should have the business-oriented vocabulary necessary to discuss their role with the enterprise's nontechnical managers.
The second keynote, delivered by David Dang, vice president of Automation Solutions at Zenergy Technologies, was a fast-moving tour of the second wave of test-automation software. In contrast to the small number of packaged automation solutions that made up the first wave, Dang said, today there are at least 136 open-source test automation frameworks to choose from. The first-wave tools were difficult to use and fragmented, and they required additional instrumentation code.
Selenium WebDriver is a leading second-wave framework that has been adopted by many companies to automate web-browser testing, a growing requirement as more and more applications become web-based. Nevertheless, Dang stressed that to use open-source tools successfully, an organization generally needs more technical resources and deeper technical knowledge. You must truly understand how the framework operates, Dang said, because you can't count on vendor support. And open-source tools require more time and effort to maintain.
Technical sessions
One technical presentation highlighted the wide range of test activities necessary to ensure the quality of the Anki Overdrive game—a very high-tech update to slot cars. Jane Fraser, test director at the company, discussed some of the hardware and software challenges faced by her team of 16 testers.
Up to four cars compete on a customizable track layout. The track is encoded so that infrared cameras under each car read the car’s position and transmit it to a smartphone hosting the game. Separate smartphones are used by competitors to control their cars’ speed, direction, and weapons—yes, you can shoot (virtually) at a competing car. For iPhones, Bluetooth low energy (BTLE) handles the communications. Fraser commented that Android devices didn’t implement BTLE consistently, so for Android, Anki uses Wi-Fi instead.
A very important aspect is the cloud storage of each session, which allows Fraser to virtually replay any game that has shown a problem. Charles Proxy is a tool she uses to view all of the traffic between the game and the cloud. She stressed the importance of testing one thing at a time and also that you can’t ignore one-off problems. Anki’s emphasis on quality, which includes logging and the development of related test tools, helped convince Fraser to join the company.
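Anki has not published its logging format, but the session-replay idea Fraser described can be sketched with the standard library: record every game event as a timestamped log line so a problem session can later be replayed deterministically, in order. The event names and fields below are hypothetical, not Anki's actual schema.

```python
import json
import io

# Hypothetical session log: each game event is appended as one JSON line,
# so a session that showed a problem can be replayed later in order.
def record_event(log, t_ms, source, event, **data):
    log.write(json.dumps({"t_ms": t_ms, "source": source,
                          "event": event, "data": data}) + "\n")

def replay(log_text):
    """Return events sorted by timestamp, as a replay harness would."""
    events = [json.loads(line) for line in log_text.splitlines() if line]
    return sorted(events, key=lambda e: e["t_ms"])

# Simulate a short session; events may arrive out of timestamp order.
log = io.StringIO()
record_event(log, 120, "car-2", "position", track_piece=7, offset_mm=11.5)
record_event(log, 95, "car-1", "battery", millivolts=3710)

for e in replay(log.getvalue()):
    print(e["t_ms"], e["source"], e["event"])
```

Because each line is self-contained JSON, the same log can be filtered per car or per event type when hunting the one-off problems Fraser warned against ignoring.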
She listed a few of her concerns as a tester: How can I make this fail? How will the customer use the game? What happens as a car's battery runs down? The multiple communications channels make testing complicated enough that the game has been used as a BTLE network stress test, but several times Fraser referred to the "thing" (as in Internet of Things) that adds another layer of complexity. For example, when a car very infrequently flew off the track, the cause was eventually traced to sunlight reflecting up into the IR camera; a sun shield fixed that. And status data continuously received from each car allows the host to increase drive to the motor as the battery discharges.
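The battery-compensation behavior Fraser mentioned can be illustrated with a simple feedback rule: as the reported battery voltage sags, the host scales up the motor drive command to hold the requested speed. The nominal voltage and clamping limit below are assumptions for illustration, not Anki's actual values.

```python
# Hypothetical compensation: scale the motor drive inversely with the
# car's reported battery voltage so effective speed stays roughly constant.
NOMINAL_MV = 3900   # assumed full-charge battery voltage, in millivolts
MAX_DUTY = 1.0      # drive command modeled as a duty cycle in [0, 1]

def compensated_drive(base_duty, battery_mv):
    scale = NOMINAL_MV / max(battery_mv, 1)  # guard against divide-by-zero
    return min(base_duty * scale, MAX_DUTY)  # clamp to full drive

# As the battery discharges, the same requested speed needs more drive.
for mv in (3900, 3600, 3300):
    print(mv, round(compensated_drive(0.5, mv), 3))
```

A real controller would likely filter the voltage readings and limit how fast the drive command changes; this sketch shows only the core idea.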
A session presented by Arondekar Gauli from InfoStretch was more commercially oriented, highlighting the benefits of the company’s QMetry automation framework. InfoStretch provided an end-to-end testing strategy for Peloton Interactive featuring a large degree of automated testing via the company’s QMetry Test Manager. This facilitated Peloton’s goal of accelerating software testing while ensuring quality.
Peloton provides both an exercise bike and in-home cycle training with the slogan “Indoor cycling: reimagined for the home.” While you ride your sensor-equipped Peloton bike, you are joined (virtually) on a 22-inch screen by trainers riding their Peloton bikes in a remote studio. They monitor your biometric information and suggest improvements to your riding technique.
At another session, "Test-Driven Everything—With Deliberate Collaboration," Jeff "Cheezy" Morgan from LeanDog and Ardita Karaj from EPAM Systems conducted a live demo of a web application revision. Karaj took the roles of both the product owner and tester as she developed test stories and acceptance criteria. For several years, Morgan has helped teams adopt Cucumber—a tool written in Ruby that runs automated acceptance tests. So, he wrote the lower-level code that then was incrementally tested by the Cucumber statements.
Well, that’s mostly how things worked out, but in the spirit of collaboration, the developer, tester, and owner roles all contributed as the changes progressed. As the presenters explained, acceptance criteria are only meaningful if all three groups have a shared understanding of what the team is building. The idea, they said, is to test in small pieces as the application is being coded, and the automation tools facilitate this, making continuous development easier.
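The presenters did not share their actual feature files, but a Cucumber acceptance test of the kind Morgan described pairs a plain-language Gherkin scenario, readable by owner, tester, and developer alike, with Ruby step definitions that drive the application. This hypothetical scenario sketches the shape of such a shared specification:

```gherkin
Feature: Update shipping address
  Scenario: Customer saves a valid address
    Given I am logged in as "pat@example.com"
    When I change my shipping address to "12 Elm St"
    Then my profile shows the shipping address "12 Elm St"
```

Each Given/When/Then line is matched to a Ruby step definition, so the same sentence the product owner wrote as an acceptance criterion becomes an automated check as the feature is coded.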
The StarWest Software Testing Conference will be held Oct. 2-7 at the Disneyland Hotel in Anaheim, CA.