Why Johnny Can't Vote

By Barbara Simons

In the 2004 US election about 30% of the electorate voted on paperless computerized voting machines. The lack of an audit trail for these machines, combined with discrepancies between exit polls and tabulated results,1 raised questions in some people's minds about the accuracy of the tabulated results.2, 3 Because there is no way to conduct a meaningful recount on a paperless voting machine, it is impossible to verify that the reported results are correct. This is not a healthy situation for a democracy.

As a result of Florida 2000, some people concluded that paper ballots simply couldn't be counted, even though businesses, banks, racetracks, lottery systems, and other entities in our society count and deal with paper all the time. Instead, paperless computerized voting systems (Direct Recording Electronic or DREs) were touted as the solution to "the Florida problem." Replacing hanging chads with 21st century technology, proponents claimed, would result in accurate election counts and machines that would be impossible to rig. Furthermore, computerized voting systems could report results shortly after the polls close.

Many election officials loved the idea, believing the new machines would be cheaper and more reliable than the old systems. Also, the lack of recounts meant that election workers could go home early on Election Day. Vendor enthusiasm was enhanced by the almost $4 billion of US government money that was promised in the Help America Vote Act (HAVA), passed in 2002. Yet now, voter verified paper trails are being demanded by numerous public interest groups, computing professionals, and members of Congress. Where did things go wrong?

Electronic voting machine software is proprietary, the certification testing process is both secret and inadequate, and the test results are secret. For years, prominent computer security experts have been warning that paperless DRE machines present major security problems, including buggy software and the risk of malicious code changing the outcome of an election. But these experts were largely ignored until Stanford professor David Dill created a petition4 calling for voter verified audit trails for voting systems. The core idea behind the Dill petition is that the voters should be able to verify that their ballots have been correctly recorded; also, it should be possible to conduct a meaningful recount.

Because of the secrecy surrounding almost every aspect of e-voting—along with a lack of public national incident reporting—independent computing technologists can provide only limited analyses of problems relating to hardware, software, testing, security, and human factors. Nonetheless, evidence of these problems is widespread and varied.

For example, voting machines deployed in Carteret County, North Carolina for the 2004 election had a storage capacity of only 3005 ballots. When these machines were used for early voting by a large number of people, 4438 votes were irretrievably lost. Because only 2287 votes separated the Republican and Democratic candidates for state Agricultural Commissioner, the State Board of Elections ruled that a revote for Agricultural Commissioner be held in Carteret County only. After the courts struck down that decision, the Board of Elections called for a statewide revote. That, too, was struck down, and the bitterly divided Board was ordered to resolve the election some other way. As of this writing, the election of Agricultural Commissioner has not been resolved.
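
The failure mode here is worth spelling out: the machines treated "storage full" as a non-event rather than as an error that should halt voting. The C sketch below illustrates the idea. The capacity is the one reported for Carteret County; the total number of early voters is inferred from the two figures above (3005 stored plus 4438 lost), and everything else is invented for illustration:

    /* capacity_demo.c: what happens when "full" is not treated as an error. */
    #include <stdio.h>

    #define CAPACITY     3005   /* reported limit of the Carteret machines */
    #define EARLY_VOTERS 7443   /* illustrative: 3005 stored + 4438 lost */

    int main(void) {
        int stored = 0, lost = 0;
        for (int voter = 0; voter < EARLY_VOTERS; voter++) {
            if (stored < CAPACITY)
                stored++;   /* ballot recorded */
            else
                lost++;     /* silently dropped: the voter saw no warning */
        }
        printf("stored %d ballots, irretrievably lost %d\n", stored, lost);
        return 0;
    }

A correct design would refuse further voting, or at minimum alert poll workers, the moment storage is exhausted.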

A case study in incompetence
Diebold, which has been manufacturing ATMs for years and is one of the major DRE vendors, has become the poster child for all that is wrong with DREs. Diebold's involvement with voting machines received significant national press when the CEO of Diebold, Walden O'Dell, stated in an August 14, 2003 letter to Central Ohio Republicans that he was "committed to helping Ohio deliver its electoral votes to the President next year."

However, the PR problem triggered by O'Dell's statement pales in comparison to the security breach uncovered by Bev Harris,5 who announced in February 2003 that she had discovered Diebold voting machine software on an open FTP website. Computer science professors Avi Rubin and Dan Wallach, and their students Tadayoshi Kohno and Adam Stubblefield, subsequently analyzed some of that software and published a security analysis in a paper that is sometimes referred to as the "Hopkins paper".6 One of the more shocking revelations was that Diebold used a single DES key (F2654hD4) to encrypt all of the data on a storage device. Consequently, an attacker with access to the source code would have the ability to modify voting and auditing records. Perhaps even more surprising, Diebold had been warned about its sloppy key management in 1997 by Douglas Jones, a professor of computer science at the University of Iowa and a member of the Iowa Board of Examiners for Voting Machines and Electronic Voting Equipment.7
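
The danger of a hardwired key is easy to demonstrate. Below is a minimal sketch using OpenSSL's legacy DES interface; the ECB mode, the sample record, and the program structure are illustrative assumptions (the Hopkins paper documents what Diebold actually did), but the key is the one quoted above. Anyone who has read the source code knows the key and can therefore decrypt, and rewrite, every record on every machine:

    /* hardwired_key.c: compile with -lcrypto. Illustrative only. */
    #include <openssl/des.h>
    #include <stdio.h>

    int main(void) {
        /* The key lives in the source code, so it is identical on every
         * machine and known to anyone who has seen the code. */
        DES_cblock key = {'F','2','6','5','4','h','D','4'};
        DES_key_schedule ks;
        DES_set_key_unchecked(&key, &ks);

        DES_cblock record = {'0','0','0','1','2','3','4','5'}; /* one 8-byte block */
        DES_cblock stored, recovered;

        DES_ecb_encrypt(&record, &stored, &ks, DES_ENCRYPT);    /* what the machine writes */
        DES_ecb_encrypt(&stored, &recovered, &ks, DES_DECRYPT); /* what an attacker reads */

        printf("recovered record: %.8s\n", (char *)recovered);  /* prints 00012345 */
        return 0;
    }

An attacker could just as easily re-encrypt a modified record, and nothing in the stored data would betray the change.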

Because of the security issues raised in the Hopkins paper, the State of Maryland, which had just committed to purchasing Diebold DREs, commissioned a study of Diebold machines by Science Applications International Corporation (SAIC). The SAIC report8 is a very fast read, since only about one-third of it was made public; the rest was redacted. But even the limited amount of information that was released is quite damning. For example, the report states that the Diebold system is so complicated that even if all of the problems were fixed, there still could be security risks because of poorly trained election officials.

Section 5 of the report, which "provides the risk assessment findings, including a discussion of the SBE security requirements, threats to the implementation of the AccuVote-TS, likelihood of exploitation of the threat, vulnerabilities, and mitigation strategies and recommendations for improving the security posture", is completely redacted.9 Even the name of the operating system being used is redacted (page 17). However, we know from internal Diebold emails that Diebold was running Windows CE 3.0, an operating-system kit that allows different versions to be assembled for different embedded applications. The certification process, which does not examine commercial off-the-shelf (COTS) software, treats Windows CE as a COTS system. In other words, the Windows CE software that actually runs on the machines has never been examined by the testing authorities.

In November 2003, the Maryland Department of Legislative Services commissioned yet another study of Diebold machines by RABA Technologies.10 The Trusted Agent report, released in January 2004, revealed physical security problems such as the use of identical keys on security panels covering PCMCIA and other sockets on the machines—as well as locks that could be picked in a few seconds.

Meanwhile, the State of Ohio, which had been considering the purchase of Diebold DREs for the entire state, hired Compuware to test hardware and software, and InfoSentry to conduct a security assessment.11 On January 12, 2005, Ohio Secretary of State J. Kenneth Blackwell announced that precinct-based optical scan voting systems (defined below), manufactured by Diebold or ES&S, would be offered to county boards of elections as the state's primary voting system.11 The Compuware study uncovered yet another hardwired password, this time involving the supervisor's card, which is used to start up each voting machine on Election Day as well as to terminate the voting process at the end of the day.

How did such flawed machines become certified? The first FEC standard for electronic voting machines, issued in 1990, was replaced in 2002.12 Many voting systems in use today were certified to the 1990 standards. Voting machines are tested and certified by three private companies, referred to as Independent Testing Authorities (ITAs). The ITAs are certified by the National Association of State Election Directors, but are not subject to any government oversight. Vendors pay for all testing.

States typically are provided with a one-page certificate saying that the software satisfied the FEC standards. By contrast, vendors are given detailed test results. Some states request the test results, but results have been provided only when the states or election officials sign non-disclosure agreements. Not only should all test results be made public, but there also should be a central data repository that collects all test results and problem incidents from voting machines, much as is done for airplanes, so that the government and election officials can check to make sure that all known problems have been rectified.

Rather than checking all software for security flaws and attacking the software to see if it can be compromised, the ITAs limit their tests to items required by the FEC standards. For example, the 2002 FEC standards call for "effective password management," but the phrase is not defined. We can infer from the Diebold results that no one is checking to see if encryption keys have been hardwired into the code. An obvious approach for dealing with buggy or malicious code would be to require that all voting software be made public, thereby exposing it to more eyes and increasing the likelihood of bug detection.13 But there is still the risk that software running on the voting machines may not be identical to the software that is made public. Further, it is possible to write a compiler that will insert malicious code into object code.14
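
Thompson showed that a backdoor need not appear in any source code that auditors see: it can live in the compiler itself. The toy sketch below is a drastic simplification; the file handling, the targeted function name, and the injected line are all invented for illustration. It behaves as an innocent copy pass except when it recognizes a hypothetical vote-recording routine:

    /* toy_compiler.c: a stand-in "compiler" that copies source to output
     * unchanged, except that it silently injects a line when it sees a
     * (hypothetical) vote-recording function. Auditing the voting source
     * reveals nothing. */
    #include <stdio.h>
    #include <string.h>

    int main(int argc, char **argv) {
        if (argc != 3) return 1;
        FILE *in = fopen(argv[1], "r");
        FILE *out = fopen(argv[2], "w");
        if (!in || !out) return 1;

        char line[1024];
        while (fgets(line, sizeof line, in)) {
            fputs(line, out);
            if (strstr(line, "int record_vote("))   /* hypothetical target */
                fputs("    if (++n % 50 == 0) choice = FAVORED; /* injected */\n",
                      out);
        }
        fclose(in);
        fclose(out);
        return 0;
    }

Thompson's full construction goes further: the compiler also recognizes its own source and re-inserts the injection logic whenever it is recompiled, so even rebuilding the compiler from clean, audited source does not remove the backdoor.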

Even open source code can be vulnerable. A recent attempt to insert a two-line backdoor into Linux was caught by observant programmers.15 But the fact that this particular backdoor attempt was stymied is no guarantee that some equally subtle future attempt will also be detected.
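
The attempt reportedly hinged on a single character: an "=" (assignment) where "==" (comparison) belonged, buried in what looked like a routine error check in the kernel's wait4() code. The self-contained program below demonstrates the same trick; the variable names echo the reported kernel fragment, but the program itself is an illustration:

    /* backdoor_demo.c: the "error check" below never fires, because
     * (uid = 0) assigns zero and then evaluates to false. Its only real
     * effect is the side effect: uid silently becomes 0, which in the
     * kernel would have meant root privileges. */
    #include <stdio.h>

    #define WCLONE 1
    #define WALL   2

    int main(void) {
        int uid = 1000;                  /* an ordinary, unprivileged user */
        int options = WCLONE | WALL;

        if ((options == (WCLONE | WALL)) && (uid = 0))
            printf("invalid options\n"); /* never reached */

        printf("uid is now %d\n", uid);  /* prints 0 */
        return 0;
    }

Note that the extra parentheses around the assignment are what suppress the compiler warning that would otherwise flag it; the change looks like a legitimate, if pedantic, sanity check.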

Because there is no audit trail and no independent check for paperless DREs, it is critical that the machines be securely stored so that no one can access the machines and modify the software. The insecurity of DREs makes them especially vulnerable during the process of delivering and storing them in voting sites. Increased security and testing costs are a significant hidden expense of paperless DREs.

E-Voting technologies
Voting systems on the market today can be divided into those that use screens or monitors and those that do not. Because they contain computers, screen-based systems can be equipped with earphones and various devices, typically hand-held, that allow voters with vision impairments to vote independently. Computerized machines are programmed to prevent voters from selecting too many candidates (overvotes), alert voters when they have omitted votes (undervotes), and permit voters to review their ballots before submitting them (second chance voting). Screen-based systems can be subdivided into systems that produce voter verified paper ballots and those that do not. A voter verified paper ballot (VVPB) is a piece of paper containing all of the voter's selections. Because it is impossible to know whether or not computerized voting machines currently on the market correctly store and count the ballots, the creation of a paper ballot allows the voter to confirm that his or her choices have been correctly recorded.16

DREs. The major manufacturers of paperless DREs are Diebold, Sequoia, ES&S, and Hart InterCivic. Several of the DREs, most of which use touch screens as inputs, are being retrofitted by the manufacturers to produce VVPBs.17 But these retrofits can themselves introduce new problems. For example, Sequoia has added a printer that prints the ballots consecutively on a roll of paper, leading to concerns that voter privacy could be compromised by tracking the order in which people vote on each machine.

Other DREs, namely those manufactured by AccuPoll and Avante, were initially designed to produce VVPBs. Avante also manufactures an optical scan model that prints optical scan ballots that sighted voters can mark, as well as an "accessible" optical scan voting system that allows vision-impaired voters to print out optical scan ballots marked to reflect their choices.

Optical scan systems. Optical scan voting systems, which are less expensive and do not entail the same security risks as DREs, typically require the voter to mark his or her ballot, in much the same way that a student taking a standardized test uses a number two pencil to make computer-readable marks.

Precinct-based optical scan systems require the voter to "test" his or her ballot by submitting it to the scanner, which notifies the voter if the ballot contains overvotes or appears to be blank. Ideally, at the end of Election Day all of the ballots are initially tallied at the precinct, and the ballots, together with the results, are then sent to the tabulation center.

Hybrid models. Ballot marking systems are a cross between DREs and optical scan systems. The AutoMARK, manufactured by Vogue Election Systems and currently marketed by ES&S, offers a touch screen like a DRE. After inserting a blank optical scan ballot into the back of the machine, the voter enters his or her choices via the touchscreen. When the voter has finished voting, the machine marks the optical scan ballot, thereby eliminating the problem of stray pencil marks that could otherwise confuse the scanner.

Another system, produced by Populex, includes a screen with an attached stylus. The system prints out a completed ballot once the voter has finished voting. The ballot uses numbers to represent voter choices, along with a corresponding bar code for the optical scanner. Both the AutoMARK and the Populex systems have attached headphones that allow blind voters to vote independently.

Cryptographic voting systems. VoteHere18 and David Chaum19 have developed voting systems that provide an encrypted receipt that voters can use to verify that their ballots have been accurately counted. Chaum's system is not currently being manufactured. A problem common to both systems is that they offer no way to conduct a recount should it be determined that a ballot tabulation problem has occurred, although individual ballots can be corrected. Also, neither scheme is particularly easy for voters to understand.
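
To give a flavor of the receipt idea, here is a bare-bones sketch; it is a simple hash commitment, not VoteHere's or Chaum's actual protocol, and it omits the machinery those systems use to keep a receipt from revealing the vote to anyone who sees it. The machine hands the voter a digest of the ballot plus a random nonce, all digests are published after the election, and each voter checks that his or her receipt appears on the list:

    /* receipt_sketch.c: compile with -lcrypto. Illustrative only. */
    #include <openssl/sha.h>
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        const char *ballot = "AgrCommissioner=Smith";  /* invented ballot */
        const char *nonce  = "q7Rk29Lp";               /* random per ballot */

        char buf[256];
        unsigned char digest[SHA256_DIGEST_LENGTH];
        snprintf(buf, sizeof buf, "%s|%s", ballot, nonce);
        SHA256((const unsigned char *)buf, strlen(buf), digest);

        /* The voter keeps this hex string; the election publishes the
         * list of all such strings. Changing any ballot after the fact
         * changes its digest and breaks the match. */
        printf("receipt: ");
        for (int i = 0; i < SHA256_DIGEST_LENGTH; i++)
            printf("%02x", digest[i]);
        printf("\n");
        return 0;
    }

Even in this toy form, the limitation noted above is visible: a receipt can prove that an individual ballot was altered, but it does not by itself provide the materials for recounting the whole election.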

Conclusion
The issue of e-voting should have been primarily a technological issue—one involving computer security, human factors, reliability, and efficiency. Instead, because of the vast sums of money involved, e-voting has been heavily politicized.

Election officials were told that DREs in the long run would be cheaper than alternative voting systems. They were told that DREs had been extensively tested and that the certification process guaranteed that the machines were reliable and secure. No mention was made of the significant costs of testing and of secure storage of DREs; no mention was made of the inadequacy of the testing and certification processes, to say nothing of the difficulty of creating bug-free software.

Technologists are attempting to educate election officials, policy makers, and the public about the risks of paperless DREs. It is critical for the continued existence of democracy throughout the world that we succeed.

Barbara Simons is retired from IBM Research. She is a former President of the Association for Computing Machinery (ACM), founder and former Chair of the US Public Policy Committee of ACM (USACM), and current chair of USACM's Committee on Voting.

Footnotes
1 http://www.appliedresearch.us/sf/epdiscrep

2 A report released on January 19, 2005 by the consortium of six news organizations that had commissioned the new exit polling system claimed that "in a number of precincts a higher than average within-precinct error [was] most likely due to Kerry voters participating in the exit polls at a higher rate than Bush voters".

3 A response to the consortium report can be found at http://uscountvotes.org/ucvAnalysis/US/USCountVotes_Re_Mitofsky-Edison.pdf.

4 http://www.verifiedvoting.org/index.asp

5 http://www.scoop.co.nz/mason/stories/HL0302/S00036.htm

6 http://avirubin.com/vote/analysis/index.html

7 See http://www.cs.uiowa.edu/~jones/voting/dieboldftp.html for an overview of the Diebold story.

8 http://www.dbm.maryland.gov/DBM%20Taxanomy/Technology/Policies%20&%20Publications/State%20Voting%20System%20Report/stateVotingSystemReport.html

9 The description of Section 5 is on p. 2. It probably was supposed to have been redacted, since the title of Section 5 is redacted in the Table of Contents.

10 http://www.raba.com/tex/press/TA_Report_AccuVote.pdf

11

12 http://www.eac.gov/election_resources/vss.html

13 The Open Voting Consortium (OVC) (http://www.openvotingconsortium.org/) is a non-profit group of software engineers and computer scientists working on an open source system that will produce a voter-verified paper ballot. Their work has been stymied because of lack of funding.

14 Ken Thompson's Turing Award speech "Reflections on Trusting Trust" http://www.acm.org/classics/sep95

15 http://kerneltrap.org/node/view/1584

16 It's possible that some non-paper or paper-like technology will be developed in the future. But for now the only options available involve paper. Furthermore, paper has the advantages of being both cheap and something that the average voter understands.

17 A major incentive is that California Secretary of State Kevin Shelley has mandated that all voting machines deployed in California must produce an Accessible Voter Verifiable Paper Audit Trail by January 2006.

18 http://www.votehere.net

19 http://www.seas.gwu.edu/~poorvi/Chaum/chaum.pdf
