Dan Tokaji's Blog
Professor Dan Tokaji
Election reform, the Voting Rights Act, the Help America Vote Act, and related topics -- with special attention to the voting rights of people of color, non-English proficient citizens, and people with disabilities

Monday, June 27
 
The DNC's Voting Machine Findings
Last week, the Democratic National Committee issued a lengthy report on the 2004 election in Ohio, entitled "Democracy at Risk." The report makes for difficult reading, largely because different parts are written by different authors, each of whom employs his or her own format and mode of exposition. The result is a disjointed document: the parts of the report aren't coordinated with one another, and they vary considerably in quality.

This is particularly true of the two sections of the report dealing with voting technology. The first is an empirical study by Walter R. Mebane, Jr. and Michael C. Herron. Their analysis finds no evidence to support claims that widespread fraud systematically misallocated votes from Kerry to Bush. Included in this portion of the report is what seems to be a thorough analysis of the performance of different types of voting equipment used in Ohio's 2004 election: punch cards, central-count optical scan, precinct-count optical scan, and direct record electronic (DRE) systems. This section is rough sledding for the casual reader, but includes some interesting findings.

Like prior studies, the report finds that there were more overvotes and undervotes (collectively known as "residual votes") in places using punch card ballots. The median uncounted vote rate of punch cards was 1.64%, compared to less than 1% for the other types of equipment. Mebane & Herron find that the residual vote rate was higher in polling places with fewer machines. This makes sense, since those polling places are more likely to be crowded, and voters are thus less likely to check their work carefully.
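For readers unfamiliar with the metric, a residual vote rate is simply the share of ballots cast that record no valid vote in a given race. The following sketch is purely illustrative and is not drawn from the report; the precinct figures in it are hypothetical, chosen only to land near the punch-card median cited above.

```python
def residual_vote_rate(ballots_cast: int, valid_votes: int) -> float:
    """Residual votes = overvotes + undervotes, i.e. ballots cast that
    recorded no valid vote in the race. Returns the rate as a percentage."""
    residual = ballots_cast - valid_votes
    return 100.0 * residual / ballots_cast

# Hypothetical punch-card precinct: 1,000 ballots cast for President,
# 984 valid votes, so 16 residual votes.
rate = residual_vote_rate(1000, 984)
print(f"{rate:.2f}%")  # 1.60% -- close to the 1.64% punch-card median
```

Note that a rate of zero is not achievable in practice: as the report observes for Allen County, some voters in every election intentionally skip the presidential race, so even well-functioning equipment produces a small residual vote rate.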

The report also identifies a number of "outlier" precincts, ones in which there were substantially more residual votes than in others using the same "kind" of voting technology. I put "kind" in quotation marks because, although the report considers DRE voting machines all of one "kind," there are different models in use. Franklin County (Columbus area), for example, uses older "full-face" DREs, while Mahoning County (Youngstown area) uses newer "touchscreen" DREs. The report finds a large number of "outlier" precincts in Franklin County with high residual votes, which I suspect is because of the older machines used there -- as well as the long lines which plagued polls in that county, due to the inadequate number of machines provided.

There's only one county in Ohio reported as having used a precinct-count optical scan system. That county is Allen, a small- to mid-sized county (less than 50,000 votes), which had a relatively low residual vote rate. The reported median was 0.76%, which approaches the lower boundary of what's feasible, given that some voters in every election intentionally choose not to cast a vote for President.

This thorough empirical analysis contrasts with the other section on voting technology, written by Dan S. Wallach. This is the weakest part of the DNC report, providing no new information about Ohio's 2004 election -- but instead reflecting, it would appear, the preconceptions of its author. Wallach was one of the co-authors of the report on the code used in Diebold machines, which galvanized opposition to electronic voting but didn't take into consideration the actual circumstances in which the voting system was implemented. (See here for more on the shortcomings of that report.) As should be obvious, one can't intelligently analyze or compare voting systems without carefully examining how the technology is being implemented.

Unfortunately, Wallach's report for the DNC almost completely disregards the facts regarding Ohio's implementation of electronic voting. He doesn't, for example, discuss any of the measures that Ohio has already taken to promote DRE security. One might well conclude that these steps are inadequate, but a serious report would at least discuss them.

Instead, Wallach's section begins with the dubious assertion that "many studies [of residual vote rates] have concluded that the new DRE voting systems are less accurate than more traditional optical scan ballots," and a bit later says "[m]any studies of residual voting rates compared to voting technologies, including the DNC's study of Ohio, have shown that the lowest residual votes occur with precinct-count optical scan systems." There are no citations to authority anywhere in Wallach's section of the report, so it's hard to say what studies he's referring to.

The evidence that does exist belies Wallach's unsupported assertions. A report by Charles Stewart of MIT found that, between 2000 and 2004, the steepest drop in residual vote rates occurred in counties switching from punch cards to DREs (1.46%); this drop was bigger than in counties that moved to optical scan (1.12%). Counties that went from optical scan to DREs saw a decrease in their residual vote rate (1.26%).

There are two other problems with Wallach's claims. First, there are hardly any studies I know of that disaggregate the performance of central-count and precinct-count optical scan systems. Second, despite Wallach's assertion regarding the performance of "new" DREs, most of the available evidence doesn't separate older full-face models from newer touchscreen models. It's therefore misleading at best to suggest that the evidence supports the conclusion that precinct-count optical scans do better than newer electronic machines -- although there is evidence that at least some older DREs had higher residual vote rates.

The only study I know of to disaggregate the different types of optical scan and DRE machines is this one by political scientists David Kimball and Martha Kropf. That study finds that precinct-count optical scan and newer DREs had the same residual vote rates in 2004 (0.8%) nationwide. Older DRE machines performed slightly worse (1.2%), with central count optical scans, punch cards, and hand counted paper ballots all performing significantly worse (1.5%, 1.8%, and 2.0% respectively).

The Ohio statistics set forth in Mebane & Herron's section don't really support Wallach's assertions, since they include only one precinct-count optical scan county (Allen), and a relatively small one at that, and they don't disaggregate older and newer DRE systems. Wallach also fails to mention research finding that electronic voting reduces the racial gap in uncounted votes, but that the evidence on precinct-count optical scan is more ambiguous (see this study by Michael Tomz & Robert P. Van Houweling).

Wallach proceeds to endorse the requirement that electronic voting systems be required to produce a contemporaneous paper record (aka, "voter verified paper audit trail") of the electronic vote. Not only does he ignore the practical problems with such a device; incredibly, he doesn't note that Ohio already has a law requiring that voting machines generate a contemporaneous paper record. Wallach also gives short shrift to the disability access issue, endorsing a hybrid DRE/optical scan system that wouldn't allow people with visual and manual dexterity impairments to vote without assistance.

Instead of dealing with the evidence, Wallach parrots the arguments that advocates of the contemporaneous paper record have been making, without seriously addressing any of the opposing arguments. Most remarkably, this section of the report doesn't really talk about what happened in Ohio at all, including the headaches which have arisen as a direct result of the legislature's decision to mandate a contemporaneous paper record, discussed here.

In short, Wallach's section of the report is great for those interested in arguments unencumbered by the facts; not so great for those who are interested in taking a serious look at how voting technology has performed, both in Ohio and elsewhere, and how it might be improved. It would be a shame if the other sections of the report, which appear to reflect a more careful analysis of what actually happened in Ohio, were tarnished by the slipshod character of this section.



Moritz College of Law The Ohio State University