The UPA Voice

December 2007

New Guidelines Proposed for US Voting Systems

By Whitney Quesenbery

Whitney is a user research and usability consultant. She is also a past president of UPA, an appointed member of the TGDC (where she chairs the subcommittee on Human Factors and Privacy), and UPA's representative on the Access Board advisory committee working to refresh the US "Section 508" accessibility regulations.

On October 31, 2007, the US Election Assistance Commission (EAC) launched the first of two public comment periods on the draft voluntary voting system guidelines (VVSG) prepared by EAC's Technical Guidelines Development Committee (TGDC).

This version is the result of nearly two years of work by the TGDC. The draft guidelines are a complete rewrite of the current VVSG 2005, addressing the next generation of voting systems. The draft contains new and expanded material on reliability and quality, usability and accessibility, security, and testing. You can read the guidelines and submit comments at the EAC website. [1]

In a significant improvement in voting system security, the guidelines require software independence for all voting systems. According to the TGDC draft guidelines, software independence can be achieved through the use of independent voter verifiable records (IVVR). [2]

The Usability, Accessibility, and Privacy chapter [3] also includes new performance requirements, tested through the Voter Performance Protocol.

"Usability is defined generally as a measure of the effectiveness, efficiency, and satisfaction achieved by a specified set of users with a given product in the performance of specified tasks. In the context of voting, the primary user is the voterů
"Additional requirements for task performance are independence and privacy: the voter should normally be able to complete the voting task without assistance from others, and the votes should be private. Lack of independence or privacy may adversely affect effectiveness (e.g., by possibly inhibiting the voter's free choice) and efficiency (e.g., by slowing down the process).
"General usability is covered by both high-level performance-based requirements (in this section) and design requirements (in following sections). Whereas the latter require the presence of specific features generally thought to promote usability, the former directly address metrics for effectiveness (e.g., correct capture of voter selections), efficiency (e.g., time taken to vote), and satisfaction. The voting system is tested by having groups of people (representing voters) attempt to perform various typical voting tasks. The requirement is met only if those tasks are accomplished with a specified degree of success." [Section 3.2.1]

This may be the first time a standard has included detailed performance benchmarks and a specific usability test protocol with required benchmark values for conformance. The draft VVSG defines three benchmarks for accuracy (effectiveness), illustrated with a short example after the list:

  • Total Completion Score - the proportion of users who successfully cast a ballot (whether or not the ballot contains erroneous votes). Failure to cast a ballot might involve problems such as a voter simply "giving up" during the voting session because of an inability to operate the system, or a mistaken belief that one has successfully operated the casting mechanism.
  • Perfect Ballot Index - the ratio of the number of cast ballots containing no erroneous votes to the number of cast ballots containing one or more errors (either a vote for an unintended choice, or a missing vote).
  • Voter Inclusion Index - a measure of both voting accuracy and consistency. It is based on mean accuracy and the associated standard deviation. Accuracy per voter depends on how many "voting opportunities" within each ballot are performed correctly. A low value for the standard deviation of these individual accuracy scores indicates higher consistency of performance across voters.
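
The white paper [4] spells out exactly how these scores are computed and what values a system must reach. As a rough illustration only, the short Python sketch below computes the first two benchmarks and the summary statistics behind the third from a made-up set of test sessions. The data model and function names are invented for this example and are not part of the VVSG or the official test protocol; the exact Voter Inclusion Index formula and its passing values come from the white paper.

    from dataclasses import dataclass
    from statistics import mean, stdev
    from typing import List, Tuple

    @dataclass
    class BallotResult:
        """Outcome of one test participant's session (hypothetical data model)."""
        cast: bool          # did the participant successfully cast a ballot?
        opportunities: int  # number of voting opportunities on the ballot
        correct: int        # opportunities completed as the participant intended

    def total_completion_score(results: List[BallotResult]) -> float:
        """Proportion of participants who cast a ballot, with or without errors."""
        return sum(r.cast for r in results) / len(results)

    def perfect_ballot_index(results: List[BallotResult]) -> float:
        """Ratio of error-free cast ballots to cast ballots with one or more errors."""
        cast = [r for r in results if r.cast]
        perfect = sum(1 for r in cast if r.correct == r.opportunities)
        flawed = len(cast) - perfect
        return perfect / flawed if flawed else float("inf")

    def accuracy_mean_and_sd(results: List[BallotResult]) -> Tuple[float, float]:
        """Per-voter accuracy summarized as a mean and standard deviation; the
        Voter Inclusion Index combines these two statistics (see [4])."""
        scores = [r.correct / r.opportunities for r in results if r.cast]
        return mean(scores), stdev(scores)

    # Ten fictional participants: seven perfect ballots, two with one error each,
    # and one voter who gave up before casting.
    session = ([BallotResult(True, 20, 20)] * 7
               + [BallotResult(True, 20, 19)] * 2
               + [BallotResult(False, 20, 0)])
    print("Total Completion Score:", total_completion_score(session))
    print("Perfect Ballot Index:  ", perfect_ballot_index(session))
    print("Mean accuracy and SD:  ", accuracy_mean_and_sd(session))

With these ten fictional sessions, the sketch reports a Total Completion Score of 0.90, a Perfect Ballot Index of 3.5, and a mean per-voter accuracy of about 0.99 with a standard deviation near 0.02.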

A system must achieve passing values for each of these benchmarks to be certified for use. The Voter Performance Protocol also measures efficiency and confidence (satisfaction). These measures are reported, but not used to determine conformance to the guidelines. The TGDC felt that although these values are important (for example, election officials might consider them when comparing different systems), they should not be used to fail a system. A new interaction design, for example, might improve accuracy, but add to the time-on-task. You can find complete details about the Voter Performance Protocol and how it was developed in a white paper, "Usability Performance Benchmarks for the VVSG." [4]
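
To make the conformance rule concrete, the fragment below (a continuation of the earlier sketch) checks only the three accuracy benchmarks when deciding pass or fail, while time on task and confidence are simply reported. The threshold numbers here are placeholders invented for illustration; the actual passing values are specified in the draft VVSG and the white paper [4].

    # Placeholder thresholds, invented for illustration; the real passing values
    # are defined in the draft VVSG and the benchmarks white paper [4].
    THRESHOLDS = {
        "total_completion_score": 0.90,
        "perfect_ballot_index": 2.0,
        "voter_inclusion_index": 0.30,
    }

    def meets_accuracy_benchmarks(tcs: float, pbi: float, vii: float) -> bool:
        """Conformance depends only on the three accuracy benchmarks."""
        return (tcs >= THRESHOLDS["total_completion_score"]
                and pbi >= THRESHOLDS["perfect_ballot_index"]
                and vii >= THRESHOLDS["voter_inclusion_index"])

    def report(tcs: float, pbi: float, vii: float,
               mean_time_sec: float, mean_confidence: float) -> None:
        """Efficiency and satisfaction appear in the report but not in the verdict."""
        verdict = "PASS" if meets_accuracy_benchmarks(tcs, pbi, vii) else "FAIL"
        print(f"Accuracy: TCS={tcs:.2f}  PBI={pbi:.2f}  VII={vii:.2f}  -> {verdict}")
        print(f"Reported only: mean time to vote={mean_time_sec:.0f}s, "
              f"mean confidence rating={mean_confidence:.1f}")

    # Made-up numbers for a hypothetical system
    report(tcs=0.90, pbi=3.5, vii=0.45, mean_time_sec=260, mean_confidence=6.2)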

Several UPA members participated in creating these guidelines:

  • Whitney Quesenbery is an appointed member of the TGDC and chair of the subcommittee on Human Factors and Privacy.
  • Sharon Laskowski leads the human factors and privacy staff at the National Institute of Standards and Technology (NIST), overseeing both the drafting of the guidelines and research that contributed to their development.
  • Janice (Ginny) Redish, Redish & Associates, conducted research for NIST on writing usable messages and instructions.
  • Bill Killam and his associates at User Centered Design conducted research and pilot studies for NIST to develop the Voter Performance Protocol.

The Road to Adoption

Any new regulation must go through public comment before it is adopted, and the VVSG will have two public review periods. In between, according to an EAC press release, "We will review each and every comment that is submitted," and the commission may revise the guidelines based on public input. The EAC will also hold public meetings to discuss the proposed guidelines and invite further public involvement. The full process for adopting the new guidelines is:

  1. The EAC submits the TGDC's draft document to the Federal Register and launches the first public comment phase, which will last for 120 days. This phase began November 1, 2007, and will run through February 2008.
  2. The EAC will collect and review all public comments submitted on the TGDC draft. After consideration of all public comments, the EAC will perform an internal review.
  3. Based upon public comment and internal review of the TGDC document, the EAC will develop and publish its draft version in the Federal Register. The public will have another 120 days to comment on the EAC draft version. The EAC will conduct public hearings about its draft version.
  4. The EAC will collect and review all comments submitted and make final modifications. The final version of the VVSG will be adopted by vote of the Commission at a public meeting and then published in the Federal Register.

Links

[1] TGDC Recommended Guidelines (and online comment tool): http://www.eac.gov/vvsg

[2] Chapter 4: Security and Audit Architecture: http://www.eac.gov/vvsg/part1/chapter04.php

[3] Chapter 3: Usability, Accessibility, and Privacy Requirements: http://www.eac.gov/vvsg/part1/chapter03.php/

[4] Human Factors and Privacy Subcommittee of the TGDC, "Usability Performance Benchmarks for the VVSG," August 2007. Available from http://vote.nist.gov/meeting-08172007/Usability-Benchmarks-081707.pdf.

 
