
Iowa caucus app fiasco a cautionary tale for all enterprises

App testing is crucial to ensuring that mission-critical mobile apps are ready for prime time -- a lesson driven home by the app that failed during the Iowa caucuses.

The need for proper application testing looms larger than ever following the disastrous rollout of mobile software meant to support one of the most important early stages of the U.S. presidential election.

Several days after the close of the Democratic Party's Iowa caucuses this week, there was still no official winner, thanks to a shoddily designed, poorly tested mobile app for reporting the results of the caucus voting.

With the results of the first significant contest in the presidential primary race hanging in the balance, the app failed, and failed miserably.

The app was slow or unresponsive. Some users couldn't log in, others received error messages they couldn't decipher, and still others couldn't even download it. Caucus workers had to wrangle paper ballots and phone in their results to headquarters, which complicated and slowed the reporting process.

The software, dubbed IowaReporterApp, was created by Washington, D.C.-based Shadow, which, according to various media accounts, rushed it into use at the caucuses after just weeks of development and minimal testing.

Testing is key for any app, but especially for an event-based app like this one, built to report the results of an election. Indeed, many apps and services are built for specific events, where you get only one chance to get it right.

For instance, the Super Bowl happened just a day before the Iowa caucuses.

"Every bit of software supporting the Super Bowl only had that one chance," said John Ruberto, a senior QA project manager at Testlio, a software testing and QA company based in Palo Alto, Calif. "[The] kickoff wasn't going to wait for a bug fix. Tax software has to be ready for a rush of returns on April 15. The IRS will not wait. Likewise, the apps used to manage election results only have that one day to get it right."

Shadow declined a request for comment and pointed to a statement from CEO Gerard Niemira posted on its website. "We sincerely regret the delay in the reporting of the results of last night's Iowa caucuses and the uncertainty it has caused to the candidates, their campaigns and Democratic caucus-goers," it read in part. "We will apply the lessons learned in the future, and have already corrected the underlying technology issue."


Painful lessons

For event-based software, you have to work out the kinks beforehand, then manage the configuration and prevent regression errors.

The situation with the Iowa caucuses app could have been avoided with more planning and testing built into the development cycle.


"For example, we follow a framework for building a test strategy around live events," Ruberto said. "This framework includes a full-scale rehearsal test that generally happens several weeks before the event. In other words, the planning for testing occurs well before the software is ready."

Beyond the inadequate testing, Shadow bypassed the app stores as a distribution method and instead pushed IowaReporterApp out through TestFlight and TestFairy, vehicles that mobile developers use to beta-test their apps, not to release them to production. Moreover, screenshots of the app obtained by Motherboard indicate that Shadow used TestFairy's freemium version rather than pay a little extra for the enterprise edition.

Rolling out an app like this requires "good design, best practices in development, trained personnel, testing both during development and at scale with the release model, and external validation testing," said Gene Spafford, a prominent computer security expert and computer science professor at Purdue University.

In contrast to the shoddy development behind IowaReporterApp, software engineering has well-established processes for building critical software that almost always get it right, Spafford said. Avionics, nuclear power and medical device software are examples.

"It costs money and time to properly staff for an effort like this, build artifacts correctly in advance of need, test them and then properly train potential users," he said. "That is why so much of the software we use generally is so bad -- companies want to cut corners to keep expenses low."

Get some hired help

Shadow might have avoided the problems that befell it by hiring a software testing company such as Testlio, QASource or QualityLogic.

Joe Walker, general manager of test operations at Boise, Idaho-based QualityLogic, said his team would have laid out a full test plan, including an assessment of the app's target devices and operating systems, then performed functionality and compatibility testing, followed by load testing.

"This app was obviously going to be interacting with a variety of different APIs," said Paul Morris, QualityLogic's engineering manager and test lab manager. "So, one of the things that we typically do is create load test scripts that will exercise the APIs with the payloads that are anticipated to be generated by the apps."

Testing those APIs is key, because they are what let the mobile app reach the back-end infrastructure that drives the application.
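To make that concrete, here is a minimal sketch of the kind of load test script Morris describes, written with the open source Locust framework. The host, endpoint and payload fields are hypothetical stand-ins for illustration, not Shadow's actual API.

```python
# Hypothetical load test for a results-reporting API, sketched with Locust.
# The host, endpoint and payload are assumptions for illustration only.
from locust import HttpUser, task, between


class PrecinctReporter(HttpUser):
    """Simulates a caucus worker's app submitting precinct results."""

    host = "https://results.example.com"  # placeholder back end
    wait_time = between(1, 5)             # seconds between tasks per user

    @task
    def submit_results(self):
        # Payload shaped like the traffic the app would be expected
        # to generate on caucus night.
        self.client.post(
            "/api/v1/precinct-results",
            json={
                "precinct_id": "IA-0001",
                "round": 1,
                "tallies": {"candidate_a": 57, "candidate_b": 43},
            },
            timeout=10,
        )
```

Iowa has roughly 1,700 precincts, so a rehearsal run might ramp up a comparable number of simulated users -- e.g., `locust -f loadtest.py --users 1700 --spawn-rate 50` -- to see whether the back end holds up under election-night traffic.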

"There are the three things that you need to ensure with APIs -- they need to be reliable, they need to be secure and they need to be scalable," said Mark Lambert, vice president of products and services at Parasoft, a maker of software testing tools in Monrovia, Calif.

"That's a problem associated with UI-centric testing practices -- just focusing on the UI/UX doesn't really make sure you're covered for when the application deploys at scale," he said, noting that part of the Shadow app's problems appeared to be scaling.

Let the tools do the trick

Proper tooling might have helped the Shadow app developers, Lambert suggested.

The software testing tools market continues to grow at a rapid clip as software development teams of all sizes recognize the importance of testing their applications. Gartner estimated that the market for test automation software currently stands at $2.3 billion, a figure that skews low because it doesn't count all testing platforms, such as test clouds. The broader market for software testing services will reach nearly $30 billion by 2023, according to Markets Report World.

There are many appropriate tools for building and testing something like IowaReporterApp, said Thomas Murphy, an analyst at Gartner.

Given the app's likely technology stack, it would be normal to use Selenium for functional validation, along with something such as JMeter to ensure the system worked with the expected number of users, he said.
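As a sketch, a Selenium-based functional check of the reporting workflow might look like the following; the staging URL, element IDs and credentials are hypothetical placeholders, not details of the actual app.

```python
# Hypothetical Selenium functional test of a login-and-report flow.
# The URL, element IDs and credentials are placeholders for illustration.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("https://results-staging.example.com/login")
    driver.find_element(By.ID, "username").send_keys("precinct-chair")
    driver.find_element(By.ID, "password").send_keys("not-a-real-password")
    driver.find_element(By.ID, "submit").click()
    # Functional validation: the worker lands on the reporting form.
    assert "Report results" in driver.page_source
finally:
    driver.quit()
```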

"You would need a device cloud to test across different mobile devices, so that means things like Sauce Labs, Bitbar and Experitest, so that you can ensure you have the right user experience on different devices expected to be used," he said.

A developer also would probably do a "dry run" or two using crowdsourced testing services such as uTest, Murphy said.

"Those runs would happen at the end of key feature sprints," he said. "Every build, you would be looking at code quality and security via something like SonarQube or Semmle. So, it is an overall application of practices designed to drive and ensure quality through the application lifecycle."

Yet, even with these tools readily available, testing is a critical step that often gets overlooked or shortchanged under deadline pressure.

Skimp on app testing at your peril

The Iowa caucus flap has made some software testing vendors eager to press their case, albeit a reasonable one. "The most critical step is to shift left and start thinking about testing, security and reliability from the very moment you start thinking about building your application," said Matt Wyman, SVP of product at Sauce Labs, a San Francisco-based mobile and web test cloud provider.

Doing so enables dev teams to quickly identify and fix bugs and avoid costly delays that occur when issues are found at the end of the release cycle, or worse, after an app has already been pushed into production, which is what happened in Iowa. By the time the bug was discovered, it was already too late. The process, and more importantly, the public's confidence in it, had already been damaged.

Still, some observers, such as Purdue's Spafford, said the best way to avoid problems such as the Iowa fiasco is to not use an app at all. But others disagree.

"The takeaway here shouldn't be that this is why you can't trust digital applications for matters of public importance," Wyman said. "I think the opposite is true. ... We should use what happened in Iowa as an opportunity to make the necessary strategic commitments to testing, security and reliability. It's a digital world, and that means digital confidence is the new public trust for every organization."
