E-Vote Guidelines Need Work
By Kim Zetter, Wired News, Jul. 07, 2005
In an effort to keep pace with changing technology and address widespread security concerns about electronic voting machines, the federal government has released new guidelines for voting systems.
The guidelines, published in late June, call for vendors to follow better programming practices and make some suggestions for addressing problems with vote integrity.
Computer security experts say the guidelines are a step in the right direction, but fall short of making voting systems secure. They also don't require systems to produce a voter-verified paper audit trail, which would allow voters to confirm their vote.
The government is accepting public comment on the guidelines for 90 days, after which it will revise them, if needed, and release them for states to adopt. But there has been some confusion over whether these should be considered final guidelines or simply a first step toward more permanent ones.
Avi Rubin, a Johns Hopkins University computer science professor and technical director of the university's Information Security Institute, said the new guidelines are an improvement but contain some serious security red flags.
He also said they have some requirements that, had they been included in previous versions of voting system guidelines, would have prevented voting systems made by Diebold Election Systems from being certified.
"One problem with the Diebold code was that it had large, complex multi-logic statements with no comments (from the designers)," Rubin said. "That wouldn't pass this standard."
Comments are a standard programming convention: plain-text notes that software designers embed in the code to help anyone reading it track changes to the software and understand what function specific lines of code perform.
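For illustration, here is a minimal, hypothetical sketch in C of the kind of complex multi-condition statement Rubin describes, shown first without comments and then with them. The variable names, values, and logic are invented for this example and do not come from any real voting system.

```c
#include <stdio.h>

/* Hypothetical illustration only -- not actual voting-machine code. */

static void record(int id, int count) {
    printf("recorded ballot batch %d (count=%d)\n", id, count);
}

int main(void) {
    int status = 0x04, count = 12, limit = 10, mode = 2, flag = 0, id = 7;

    /* Uncommented version: the intent is opaque to a reviewer. */
    if ((status & 0x04) && (count > limit || (mode == 2 && !flag)) && id != 0)
        record(id, count);

    /* Commented version of the same check:
       - bit 2 of status means the smart card validated,
       - the batch is recorded when the count exceeds the precinct limit
         or the machine is in provisional mode (2) without an override,
       - and only when a nonzero batch id is present. */
    if ((status & 0x04) &&
        (count > limit || (mode == 2 && !flag)) &&
        id != 0) {
        record(id, count);
    }

    return 0;
}
```

The point of the second version is not the extra lines but the audit trail they leave: a reviewer or certification lab can check the stated intent against what the condition actually does.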
Rubin was part of a group of computer scientists who examined source code for the Diebold system in 2003 and found, in addition to a number of security problems, that the code didn't follow basic programming conventions, indicating that the programmers were unskilled and had few or no quality-control procedures in place. Rubin's findings prompted voting activists to call for the Diebold system to be decertified and helped launch a voting-machine reform movement that included a demand for paper audit trails.
The new guidelines were created by the Technical Guidelines Development Committee, headed by the acting director of the National Institute of Standards and Technology and composed of election officials and people with varying technical abilities. The committee created the guidelines for the U.S. Election Assistance Commission, a new federal entity that Congress created after the election problems in Florida in 2000 to improve the integrity and efficiency of elections.
Voting system guidelines, or standards, are not mandatory. States can choose to adopt the standards and require vendors that sell voting systems in their state to adhere to them. Currently, 38 states and the District of Columbia require voting systems used in the state to meet the standards in whole or in part.
The new guidelines replace previous sets created in 1990 and revised in 2002, which many computer scientists considered inadequate because they failed to address certain security issues or to establish good software-development practices and testing procedures.
Those criticisms haven't been satisfied by the new standards, Rubin said.
One concern is the use of commercial off-the-shelf software, or COTS. Voting vendors that use COTS in their machines (such as Diebold, which uses a Windows operating system in its touch-screen voting system) don't have to pass the off-the-shelf software through testing and certification procedures.
Rubin was also disappointed to see that the guidelines don't prohibit the use of telecommunications in election systems. Many electronic voting systems have modems that allow election officials to connect the machines to a phone line to transmit election results. Rubin and other computer scientists say this could allow someone to hack the machine.
"All of the standard threats that would come up (when you have) a networked system, they didn't address," Rubin said.
That's primarily because there are no perfect solutions for protecting voting machines from attack if they're networked in such a way, he said.
"Given that we can't do anything about them, you shouldn't use telecommunications in voting," Rubin said. "But they don't seem to be willing to take a stand against things that are really insecure."
Rubin also thought the testing requirements should include real attack tests conducted by red teams that would try to find ways to hack the system. "Red team" is a term used to describe testers who attempt to break into a system to test its security vulnerabilities. When a red-team test was performed on the Diebold system after Rubin's report came out in 2003, the team found it could easily hack the system in several ways, through tests the lab that certified the machines never performed.
Rubin recently launched a private company with three other computer security experts that will conduct such tests. Rubin said, however, that his company will only test voting equipment pro bono in order to remove any potential conflict of interest with his academic work on voting system security.
Sanford Morganstein, president of Populex, a company that makes a voting machine that produces a paper ballot card, was disappointed that the guidelines don't require a paper audit trail, but instead leave it to states to decide whether to make one mandatory. Currently, 20 states require their voting machines to have a voter-verified paper audit trail.
"There are about two or three of us (vendors who) believe strongly in the paper trail," Morganstein said. "We think democracy depends on the will and the confidence of the voters. And we think paper trails enhance confidence."
"They would definitely win over the respect of (the) computer science community if they did verified voting," Rubin said. "But they would have a mutiny on their hands from (some of the) vendors and election officials. They're trying to walk a middle ground. But I don't think that's necessary for them to do. I think they should do what's right."
Ron Rivest, a computer scientist at MIT and one of the founders of RSA Security, was a member of the committee that created the guidelines. He said the committee was simply following guidance from the Election Assistance Commission.
"I think the committee had guidance from EAC that what they wanted to see was language specifying how to use a (voter-verified paper audit trail) if a state would choose to do so, rather than trying to say whether VVPAT should be demanded."
To that end, they provided guidelines for the VVPAT. Rivest said that, given the time frame the committee had to create the guidelines (which are supposed to be in place by 2006), he considered the standards a big improvement over previous ones, precisely because they include language about a paper audit trail.
But he said no one should consider these the final word on voting system standards. He said the voting system guidelines are a work in progress.
"If anybody tries to interpret these as a final product ... they may be disappointed," Rivest said. "But they may be happy to see which direction they're going in. We didn't have time to do a comprehensive job on security. It's a set of first steps. The security work has just begun. I encourage people to send in comments."
Rivest said the telecommunications issue would likely be among the security issues that his group revisits when they embark on the next version of the standards. Both he and Mat Heyman, spokesman for NIST, said they understood that the standards were meant to be a preliminary version that would be put in place for the 2006 elections.
"I think the (committee) members would expect there would be other revisions out in time to be relevant to the 2008 elections," Rivest said.
But according to EAC spokeswoman Jeannie Layson, the guidelines released for public comment are meant to be permanent guidelines, once the EAC releases a final version following the public comment period.
"Certainly as technology evolves, we will continue working with NIST and the TGDC to amend them as necessary," Layson said. "But as far as I know, this is a permanent set of guidelines."