Throughout Monday’s caucuses in Iowa, according to press reports, precinct chairs across the state struggled to use a hastily built and inadequately tested mobile app, developed by a shadowy for-profit tech company, to report results to the Iowa Democratic Party. This appears to have been the main cause of the massive delays in publishing the caucus results.

When the media learned last month that the Iowa Democratic Party planned to use a mobile app to report caucus results, the party refused to reveal many details about the app. It didn’t publish the app’s source code for independent security researchers to inspect, nor did it give any information about how thoroughly the app had been tested (apparently, not very). The party wouldn’t even name the vendor that it hired to develop the app (Shadow, Inc.), claiming that doing so could inadvertently help potential cyber attackers.

Elected officials couldn’t get answers, either. The office of Sen. Ron Wyden asked the Democratic National Committee for details about the app three times in the lead-up to the Iowa caucuses, but the requests were ignored, according to the Wall Street Journal. Wyden is himself a Democrat, representing Oregon.

This is the opposite of what the Iowa Democratic Party should have done.

Hiding the details of how a computer system works does nothing to make it more secure. This is known as “security through obscurity,” and it provides a false sense of security, while making it harder for people to have confidence that the system actually works as expected.

Election systems should instead rely on the information security principle of “open design.” The National Institute of Standards and Technology, the federal agency responsible for recommending standards that industry and government agencies should follow, lists open design as an important principle for designing secure computer systems. “System security should not depend on secrecy of the implementation or its components,” NIST’s Guide to General Server Security says.

This open design practice is commonplace in the software industry, particularly in systems that handle very sensitive data. The Signal app, for example, is widely known as one of the best-designed end-to-end encrypted messaging apps. Unlike the Iowa caucus reporting app:

  • Signal’s source code is freely available on the internet for anyone to inspect. The Android, iPhone, and desktop apps are each published in public repositories on GitHub.
  • The inner workings of Signal’s encryption algorithm are publicly documented, and the implementation has been peer-reviewed.

While it’s possible that cyberattackers could use this wealth of information about how the app works to find vulnerabilities, the benefits of open design by far outweigh the risks. When flaws are inevitably found, they are more likely to get fixed rather than to be quietly exploited by attackers, and the software ecosystem as a whole improves because of it. And, perhaps most importantly, open design gives users confidence in the security of the app without having to blindly trust the claims of the developers.

The Iowa caucus reporting app was not used to actually cast any votes but rather to more quickly deliver the results of these votes to the state party. Still, election systems are large and complex. If we insist on using technology for any parts of them — which we do, and probably will continue to do — it’s important that every part is as transparent as possible.

All election-related software, whether it’s registering voters, casting votes, delivering election results to a central database, or anything else, should use open design principles.

Here’s what the Iowa Democratic Party should have done (and what everyone else in the business of running elections should do) to ensure that election software is as secure and reliable as possible, and that voters can have confidence in its use:

  • Commit to total transparency about the entire process, including which vendor is chosen to develop election-related software and how that decision was made.
  • Make sure, well before the election, that the software’s design and implementation, including the full source code and documentation, are freely available to the public so that experts can investigate them for issues.
  • Hire professionals to conduct third-party security audits, fix any issues found, and publish the results of these audits. (This is common practice for companies that take security seriously; the whistleblower submission system SecureDrop recently published the results of a security audit, and so did the VPN provider TunnelBear).
  • Welcome outside security research, work with university research teams, and run a bug bounty program where individuals can get paid for disclosing vulnerabilities they discover.
  • Conduct a test run or low-stakes trial of the software using real end users.
  • Allow much more time than the two months reportedly given to the team behind the Shadow app to get it up and running in Iowa; all of the above steps require it.

Even this won’t ensure election security. “No steps can guarantee security in this day and age,” Douglas Jones, an associate professor of computer science at the University of Iowa, told The Intercept. “Therefore, we need to build our entire framework for the use of technology in such a way that we don’t depend on it to be secure.” This, at least, the Iowa Democratic Party did correctly. When the reporting app failed, it was able to fall back to paper records, albeit with extensive delays.

Jones also believes that new technology should never be deployed for the first time in a high-stakes national election. “New voting technologies should be used first in small local elections where the stakes are low,” he said, adding that “the best time to deploy new voting machines in the U.S. is right after November of an even-numbered year.”