U.S. Election Assistance Commission
TGDC Recommended Guidelines, Part 1: Equipment Requirements

Chapter 6: General Core Requirements

6.1 General Design Requirements

Note: The ballot counter requirements from [VVSG2005] have been converted into functional requirements (Part 1: 4.3.5 “Ballot counter”).

6.1-A No obvious fraud

Voting systems SHALL contain no logic or functionality that cannot be justified in terms of a required system function or characteristic.

Applies To: Voting system

Test Reference: Part 3: 4.3 “Verification of Design Requirements”, 4.5.2 “Security”

Source: New requirement

6.1-B Verifiably correct vote recording and tabulation

The vote recording and tabulation logic in a voting system SHALL be verifiably correct.

Applies To: Voting system

Test Reference: Part 3: 4.6 “Logic Verification”

DISCUSSION

The key word in this requirement is "verifiably." If a voting system is designed in such a way that it cannot be shown to count votes correctly despite full access to its designs, source code, etc., then it does not satisfy this requirement.

Source: New requirement

6.1-C Voting system, minimum devices included

Voting systems SHALL contain at least one EMS and at least one vote-capture device.

Applies To: Voting system

Test Reference: Part 3: 4.2 “Physical Configuration Audit”

DISCUSSION

All voting systems must be capable of election definition, vote collection, counting and reporting. To accomplish this requires at least one EMS and at least one vote-capture device.

Source: Clarification of [VSS2002]

6.1-D Paper ballots, separate data from metadata

Paper ballots used by paper-based voting devices SHALL meet the following standards:

  1. Marks that identify the unique ballot style SHALL be outside the area in which votes are recorded, so as to minimize the likelihood that these marks will be mistaken for vote responses and the likelihood that recorded votes will obliterate these marks; and
  2. If alignment marks are used to locate the vote response fields on the ballot, these marks SHALL be outside the area in which votes are recorded, so as to minimize the likelihood that these marks will be mistaken for vote responses and the likelihood that recorded votes will obliterate these marks.

Applies To: Paper-based device

Test Reference: Part 3: 4.3 “Verification of Design Requirements”

DISCUSSION

See also Requirement Part 2:4.5.4.2-B.

Source: [VSS2002] I.3.2.4.2.1

6.1-E Card holder

A frame or fixture for printed ballot cards is optional. However, if such a device is provided, it SHALL:

  1. Position the card properly; and
  2. Hold the ballot card securely in its proper location and orientation for voting.

Applies To: MMPB

Test Reference: Part 3: 4.3 “Verification of Design Requirements”

Source: [VSS2002] I.3.2.4.2.5

6.1-F Ballot boxes

Ballot boxes and ballot transfer boxes, which serve as secure containers for the storage and transportation of voted ballots, SHALL:

  1. Provide specific points where ballots are inserted, with all other points on the box constructed in a manner that prevents ballot insertion; and
  2. If needed, contain separate compartments for the segregation of ballots that may require special handling or processing.

Applies To: Paper-based device

Test Reference: Part 3: 4.3 “Verification of Design Requirements”

DISCUSSION

Requirement Part 1: 6.1-F.B should be understood in the context of Requirement Part 1: 7.5.3-A.18, Requirement Part 1: 7.7.3-A and Requirement Part 1: 7.7.3-B. The differing options in how to handle separable ballots mean that separate compartments might not be required.

Source: [VSS2002] I.3.2.4.2.6

6.1-G Vote-capture device activity indicator

Programmed vote-capture devices SHALL include an audible or visible indicator to provide the status of each voting device to election judges. This indicator SHALL:

  1. Indicate whether the device is in polls-opened or polls-closed state; and
  2. Indicate whether a voting session is in progress.

Applies To: Vote-capture device ∧ Programmed device

Test Reference: Part 3: 4.3 “Verification of Design Requirements”

DISCUSSION

Polls-closed could be broken down into pre-voting and post-voting states as in Part 1: 8.2 “Vote-Capture Device State Model (informative)” or further divided into separate states for not-yet-tested, testing, ready/not ready (broken), and reporting.

Source: Clarified from [VSS2002] I.2.5.1.c and I.3.2.4.3.1

6.1-H Precinct devices operation

Precinct tabulators and vote-capture devices SHALL be designed for operation in any enclosed facility ordinarily used as a polling place.

Applies To: Precinct tabulator, Vote-capture device

Test Reference: Part 3: 4.3 “Verification of Design Requirements”

Source: [VSS2002] I.3.2.2.1 / [VVSG2005] I.4.1.2.1

6.2 Voting Variations

The purpose of this formulaic requirement is to clarify that support for a given voting variation cannot be asserted at the system level unless device-level support is present. It is not necessarily the case that every device in the system would support every voting variation claimed at the system level; e.g., vote-capture devices used for in-person voting may have nothing in common with the vote-capture devices (typically MMPB) used for absentee voting. However, sufficient devices must be present to enable satisfaction of the system-level claim.

6.2-A System composition

Systems of the X class SHALL gather votes using vote-capture devices of the X device class, count votes using tabulators of the X device class, and perform election management tasks using an EMS of the X device class, where X is any of the voting variations (In-person voting, Absentee Voting, Review-required ballots, Write-ins, Split precincts, Straight party voting, Cross-Party Endorsement, Ballot Rotation, Primary Elections, Closed Primaries, Open Primaries, Provisional-Challenged Ballots, Cumulative Voting, N-of-M Voting, and Ranked Order Voting).

Applies To: In-person voting, Absentee voting, Review-required ballots, Write-ins, Split precincts, Straight party voting, Cross-party endorsement, Ballot rotation, Primary elections, Closed primaries, Open primaries, Provisional-challenged ballots, Cumulative voting, N-of-M voting, Ranked order voting

Test Reference: Part 3: 4.2 “Physical Configuration Audit”

DISCUSSION

If the voting system requires that absentee ballots be counted manually, then it does not conform to the absentee voting class. However, it may conform to the review-required ballots class.

If the voting system requires the allocation of write-in votes to specific candidates to be performed manually, then it does not conform to the write-ins class. However, it may conform to the review-required ballots class.

If the voting system requires that provisional/challenged ballots be counted manually, then it does not conform to the provisional-challenged ballots class. However, it may conform to the review-required ballots class.

Source: Conformance ramifications of system/device relationship

6.3 Hardware and Software Performance, General Requirements

This section contains requirements for hardware and software performance:

6.3.1 Reliability

The following sections provide the background and rationale for the reliability benchmarks appearing in Part 1: 6.3.1.5 “Requirements”. Although there is no "typical" volume or "typical" configuration of voting system, given the diversity among jurisdictions, it is nevertheless necessary to base the benchmarks on some rough estimates so that they are at least of the correct order of magnitude, albeit not optimal for every case.

6.3.1.1 Classes of equipment

Because different classes of voting devices are used in different ways in elections, the kinds of volume against which their reliability is measured and the specific reliability that is required of them differ. The classes of voting devices for which estimates are provided are identified in the sections below; please refer to Appendix A for the definitions of these device classes.

6.3.1.2 Estimated volume per election

The "typical" volumes described below are the volumes that medium-sized jurisdictions in western states need their equipment to handle in a high turn-out election, as of 2006. A county of 150 000 registered voters will have 120 000 ballots cast in a presidential election. A typical polling place will be set up to handle 2000 voters, which equals 60 polling places in a mid-sized county.
Central-count optical scanner: Medium-sized jurisdictions in western states need their central count equipment to scan 120 000 ballots in an election. Depending upon the actual throughput speeds of the scanners, they use 2 to 8 machines to handle the volume. "Typical" volume for a single scanner is the maximum tabulation rate that the manufacturer declares for the equipment times 8 hours.

Election Management System: The volume equals the total number of interactions with the vote gathering equipment required by the design configuration of the voting system to collect the election results from all the vote-capture devices.

The typical constant across the systems is that the Election Management System will interact once with each polling place for each class of equipment. Assuming our "typical" county with 60 polling places, one or more DREs in each polling place, and one or more optical scan devices, that totals 2×60=120 transactions per election.

The primary differences in the central count EMS environment are whether the optical scan devices are networked with the EMS or function independently.

In the networked environment, the device will interact with the EMS once per batch (typically around 250 ballots). So, 120 000/250=480 interactions.

In the non-networked environment, the results are handled similarly to the polling place uploads: results are copied off to media and uploaded to the EMS. Since central counting typically occurs over several days – especially in a vote-by-mail environment – the test should include several uploads from each scanner: 2 scanners × 4 days = 8 uploads.

To simplify these different cases to a single benchmark, we use the highest of the volumes (480 transactions), which leads to the lowest failure rate benchmark.
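
The arithmetic behind these three EMS volume estimates is simple; as an illustration only, a short sketch (in Python, using the "typical" county figures assumed above):

```python
# Illustrative only: EMS interaction volumes for the assumed "typical" county
# (60 polling places, 120 000 ballots), as discussed above.
POLLING_PLACES = 60
BALLOTS = 120_000

polling_place_uploads = 2 * POLLING_PLACES   # one upload per polling place per equipment class = 120
networked_batches = BALLOTS // 250           # one interaction per ~250-ballot batch = 480
non_networked_uploads = 2 * 4                # 2 central scanners x 4 days of counting = 8

# The benchmark uses the highest of these volumes.
ems_volume = max(polling_place_uploads, networked_batches, non_networked_uploads)
print(ems_volume)  # 480 transactions
```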

Precinct-count optical scanner: Polling place equipment has a maximum number of paper ballots that can be handled before the outtake bins fill up, usually around 2500.

Direct Recording Electronic: A typical ballot takes 3–5 minutes to vote, so the most a single DRE should be expected to handle is 150–200 voters in a 12-hour election day.

Electronically-assisted Ballot Marker: Voting with an EBM typically takes longer than with a DRE. An individual unit should not be expected to handle more than 70 voters on election day.

Ballot activator: The volume of use of these devices matches the volume for the polling place, which in our assumed county is 2000 per polling place. Our assumed county would have 10–14 DREs per polling place with around 20 tokens; each token would be used about 100 times.

Audit device: No information available.

The estimated volumes are summarized in Part 1: Table 6-1. The estimates for PCOS and CCOS have been generalized to cover precinct tabulator and central tabulator respectively, and a default volume based on the higher of the available estimates has been supplied for other vote-capture devices that may appear in the future. Audit devices are assumed to be comparable to activation devices in the numbers that are deployed.

Table 6-1 Estimated volumes per election by device class

Device class              | Estimated volume per device per election | Estimated volume per election
--------------------------|------------------------------------------|------------------------------
central tabulator         | Maximum tabulation rate times 8 hours    | 120 000 ballots
EMS                       | 480 transactions                         | 480 transactions
precinct tabulator        | 2000 ballots                             | 120 000 ballots
DRE                       | 200 voting sessions                      | 120 000 voting sessions
EBM                       | 70 voting sessions                       | 120 000 voting sessions
other vote-capture device | 200 voting sessions                      | 120 000 voting sessions
activation device         | 2000 ballot activations                  | 120 000 ballot activations
audit device              | 2000 ballots                             | 120 000 ballots

6.3.1.3 Manageable failures per election

The term failure is defined in Appendix A. In plain language, failures are equipment breakdowns, including software crashes, such that continued use without service or replacement is worrisome to impossible. Normal, routine occurrences like running out of paper are not considered failures. Misfeeds of ballots into optical scanners are handled by a separate benchmark (Requirement Part 1: 6.3.3-A), so these are not included as failures for the general reliability benchmark.

The following estimates express what failures would be manageable for a mid-sized county in a high-turnout election. Medium-sized counties send out troubleshooters to polling places to replace machines or resolve problems with them.

Any failure that results in all CVRs pertaining to a given ballot becoming unusable, or that makes it impossible to determine whether or not a ballot was cast, is called disenfranchisement. It is unacceptable for even one ballot to become unrecoverable or to end up in an unknown state. For example, an optical scanner that shreds a paper ballot, rendering it unreadable by human or machine, is assessed a disenfranchisement-type failure; so is a DRE that is observed to "freeze" when the voter attempts to cast the ballot, providing no evidence one way or the other whether the ballot was cast.

Central-count optical scanner: No more than one machine breakdown per jurisdiction requiring repairs done by the manufacturer or highly trained personnel. Medium-sized jurisdictions plan on having one backup machine for each election.

Election Management System: This is a critical system that must perform in an extremely time-sensitive environment for a mid-sized county over a 3 to 4 hour period on election night. Any failure during the test that requires the manufacturer or highly trained personnel to recover should disqualify the system. Otherwise, as long as the manufacturer's documentation provides usable procedures for recovering from the failures and methods to verify results and recover any potentially missing election results, 1 failure is assessed for each 10 minutes of downtime (minimum 1 – no fractional failures are assessed). A total of 3 or more such failures disqualifies the system.

Precinct-count optical scanner: A failure in this class of machine has a negligible impact on the ability of voters to vote in the polling place. No more than 1 of the machines in an election should experience a serious failure that would require the manufacturer or highly trained personnel to repair (e.g., will not boot). No more than 5 % of the machines in the election should experience failures that require the attention of a troubleshooter/poll worker (e.g., memory card failure).

Direct Recording Electronic and Electronically-assisted Ballot Marker: No more than 1 % of the machines in an election should experience failures that would require the manufacturer or highly trained personnel to repair (e.g., will not boot), and no more than 3 % of the machines in an election should experience failures that require the attention of a troubleshooter (e.g., printer jams, recalibration, etc.).

Ballot activator: The media/token should not fail more than 3 % of the time (the county will provide the polling place with more tokens than necessary). No more than 1 of the devices should fail (the device will be replaced by the county troubleshooter).

Audit device: No information available. If comparable to ballot activators, there should be at least 1 spare.

The manageable failure estimates are summarized in Part 1: Table 6-2. A "user-serviceable" failure is one that can be remedied by a troubleshooter and/or election official using only knowledge found in voting equipment user documentation; a "non-user-serviceable" failure is one that requires the manufacturer or highly trained personnel to repair.

Please note that the failures are relative to the collection of all devices of a given class, so the value 1 in the row for central tabulator means 1 failure among the 2 to 8 central tabulators that are required to count 120 000 ballots in 8 hours, not 1 failure per device.

Table 6-2 Estimated manageable failures per election by device class

Device class              | Failure type                  | Manageable failures per election
--------------------------|-------------------------------|---------------------------------
voting device (all)       | Disenfranchisement            | 0
central tabulator         | All [1]                       | 1
EMS                       | Non-user-serviceable          | 0
EMS                       | User-serviceable (10 minutes) | 2
precinct tabulator        | Non-user-serviceable          | 1
precinct tabulator        | User-serviceable              | 5 % of devices = 3
DRE                       | Non-user-serviceable          | 1 % of devices = 6
DRE                       | User-serviceable              | 3 % of devices = 18
EBM                       | Non-user-serviceable          | 1 % of devices = 17
EBM                       | User-serviceable              | 3 % of devices = 51
other vote-capture device | Non-user-serviceable          | 1 % of devices = 6
other vote-capture device | User-serviceable              | 3 % of devices = 18
activation device         | Media/token                   | 3 % of tokens = 36
activation device         | Main unit                     | 1
audit device              | All                           | 1

[1] Apart from misfeeds, which are handled by a separate benchmark, TGDC experience is that central tabulator failures are never user-serviceable.
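
For illustration only (not part of the guidelines), the absolute counts in Table 6-2 follow from the stated percentages applied to the device counts implied by Table 6-1; a minimal sketch of that arithmetic:

```python
# Illustrative only: device counts implied by Table 6-1 for a 120 000-volume
# election, and the manageable-failure counts of Table 6-2 derived from them.
ELECTION_VOLUME = 120_000

dres   = ELECTION_VOLUME // 200    # 600 DREs at 200 voting sessions each
ebms   = ELECTION_VOLUME // 70     # ~1714 EBMs at 70 voting sessions each
pcos   = ELECTION_VOLUME // 2000   # 60 precinct tabulators at 2000 ballots each
tokens = ELECTION_VOLUME // 100    # 1200 activation tokens, each used ~100 times

print(round(0.01 * dres), round(0.03 * dres))    # DRE: 6 and 18
print(round(0.01 * ebms), round(0.03 * ebms))    # EBM: 17 and 51
print(round(0.05 * pcos))                        # precinct tabulator: 3
print(round(0.03 * tokens))                      # activation tokens: 36
```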

6.3.1.4 Derivation of benchmarks

We focus on one class of device and one type of failure at a time, and we assume that each failure is followed by repair or replacement of the affected device. This means that we consider two failures of the same device to be equivalent to one failure each of two different devices of the same class. The sense of "X % of the machines fail" is thus approximated by a simple failure count, which is X/100 times the number of devices. This then must be related to the total volume processed by the entire group of devices over the course of an election in order to determine the number of failures that would be manageable in an election of that size.

To reduce the likelihood of an unmanageable situation to an acceptably low level, a benchmark is needed such that the probability of occurrence of an unmanageable number of failures for the total volume estimated is "acceptably low." That "acceptably low level" is here defined to be a probability of no more than 1 %, except in the case of disenfranchisement, where the only acceptable probability is 0.

Under the simplifying assumption that failures occur randomly and follow a Poisson distribution, the probability of observing n or fewer failures for volume v and failure rate r is the value of the Poisson cumulative distribution function:

P(n, rv) = \sum_{k=0}^{n} \frac{(rv)^k e^{-rv}}{k!}

Consequently, given v_e (the estimated total volume) and n_e (the maximum manageable number of failures for volume v_e), the desired benchmark rate r_b is found by solving P(n_e, r_b v_e) = 0.99 for r_b. This sets the benchmark rate such that there remains a 1 % risk that a greater number of failures would occur with marginally conforming devices during an election in which they collectively process volume v_e. In the case of disenfranchisement, that risk is unacceptable; hence the benchmark is simply set to zero.
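
For illustration only, the benchmark rates in Part 1: Table 6-3 can be reproduced numerically from these estimates; a minimal sketch using SciPy (the function and variable names here are not drawn from the guidelines):

```python
# Illustrative sketch: solve P(n_e, r_b * v_e) = 0.99 for the benchmark rate r_b.
# n_e and v_e are taken from Tables 6-2 and 6-1 respectively.
from scipy.stats import poisson
from scipy.optimize import brentq

def benchmark_rate(n_e: int, v_e: float, p: float = 0.99) -> float:
    """Failure rate r_b such that the Poisson CDF at n_e, with mean r_b*v_e, equals p."""
    return brentq(lambda r: poisson.cdf(n_e, r * v_e) - p, 1e-12, 1.0)

print(benchmark_rate(1, 120_000))   # central tabulator: ~1.237e-06 (cf. Table 6-3)
print(benchmark_rate(6, 120_000))   # DRE, non-user-serviceable: ~1.941e-05
print(benchmark_rate(2, 480))       # EMS, user-serviceable: ~9.08e-04
```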

6.3.1.5 Requirements

6.3.1.5-A Failure rate benchmark

All devices SHALL achieve failure rates not exceeding those indicated in Part 1: Table 6-3.

Applies To: Voting device

Test Reference: Part 3: 5.3.2 “Critical values”

Source: Revised from [VSS2002] I.3.4.3 / [VVSG2005] I.4.3.3

Table 6-3 Failure rate benchmarks

Device class              | Failure type                  | Unit of volume    | Benchmark
--------------------------|-------------------------------|-------------------|------------
voting device (all)       | Disenfranchisement            |                   | 0
central tabulator         | All                           | ballot            | 1.237×10⁻⁶
EMS                       | Non-user-serviceable          | transaction       | 2.093×10⁻⁵
EMS                       | User-serviceable (10 minutes) | transaction       | 9.084×10⁻⁴
precinct tabulator        | Non-user-serviceable          | ballot            | 1.237×10⁻⁶
precinct tabulator        | User-serviceable              | ballot            | 6.860×10⁻⁶
DRE                       | Non-user-serviceable          | voting session    | 1.941×10⁻⁵
DRE                       | User-serviceable              | voting session    | 8.621×10⁻⁵
EBM                       | Non-user-serviceable          | voting session    | 8.013×10⁻⁵
EBM                       | User-serviceable              | voting session    | 3.058×10⁻⁴
other vote-capture device | Non-user-serviceable          | voting session    | 1.941×10⁻⁵
other vote-capture device | User-serviceable              | voting session    | 8.621×10⁻⁵
activation device         | Media/token                   | ballot activation | 2.027×10⁻⁴
activation device         | Main unit                     | ballot activation | 1.237×10⁻⁶
audit device              | All                           | ballot            | 1.237×10⁻⁶

6.3.1.5-B No single point of failure

All systems SHALL protect against a single point of failure that would prevent further voting at the polling place.

Applies To: Voting system

Test Reference: Part 3: 4.3 “Verification of Design Requirements”

Source: [VSS2002] I.2.2.4.1.a / [VVSG2005] I.2.1.4.a

6.3.1.5-C Protect against failure of input and storage devices

All systems SHALL withstand, without loss of data, the failure of any data input or storage device.

Applies To: Voting system

Test Reference: Part 3: 4.3 “Verification of Design Requirements”

Source: [VSS2002] I.2.2.4.1.e / [VVSG2005] I.2.1.4.e

6.3.2 Accuracy/error rate

Since accuracy is measured at the system level, it is not necessary to define different benchmarks for different classes of devices.

6.3.2-A Satisfy integrity constraints

All systems SHALL satisfy the constraints in Part 1: 8.3 “Logic Model (normative)”.

Applies To: Voting system

Test Reference: Part 3: 4.6 “Logic Verification”

Source: Formalization of general requirements

6.3.2-B End-to-End accuracy benchmark

All systems SHALL achieve a report total error rate of no more than 8×10⁻⁶ (1 / 125 000).

Applies To: Voting system

Test Reference: Part 3: 5.3.4 “Accuracy”

DISCUSSION

For the definition of report total error rate, see Requirement Part 3: 5.3.4-B.
This benchmark is derived from the "maximum acceptable error rate" used as the lower test benchmark in [VVSG2005]. That benchmark was defined as a ballot position error rate of 2×10⁻⁶ (1 / 500 000).

Although there is no "typical" ratio of votes to ballot positions, given the diversity among jurisdictions, it is nevertheless necessary to base the benchmark on some rough estimates so that it is at least of the correct order of magnitude, albeit not optimal for every case. The rough estimates are as follows. In a presidential election, there will be approximately 20 vote-for-1 contests on each ballot, with an average of 4 candidates, including the write-in position, per contest. (Some states will have fewer contests and some more. A few contests, like President, would have 8–13 candidates; most have 3 candidates including the write-in, and a few have 2 candidates.) The estimated ratio of votes to ballot positions is thus ¼.
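
As a sketch of the implied arithmetic (assuming the report total error rate is counted per reported vote):

```latex
\frac{2\times 10^{-6}\ \text{errors per ballot position}}{1/4\ \text{votes per ballot position}}
  = 8\times 10^{-6}\ \text{errors per vote}
```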

For paper-based tabulators, this general requirement is elaborated in Part 1: 7.7.5 “Accuracy”.

Source: Generalized and clarified from [VSS2002] I.3.2.1 / [VVSG2005] I.4.1.1

Other accuracy-related requirements include Requirement Part 1: 6.4.1.7-D, Requirement Part 1: 7.1-E, Requirement Part 1: 7.1-F, Requirement Part 1: 7.5.4-A, and Requirement Part 1: 7.8.3.1-B.

6.3.3 Misfeed rate

6.3.3-A Misfeed rate benchmark

The misfeed rate SHALL NOT exceed 0.002 (1 / 500).

Applies To: Paper-based device ∧ Tabulator, EBM

Test Reference: Part 3: 5.3.5 “Misfeed rate”

DISCUSSION

Multiple feeds, misfeeds (jams), and rejections of ballots that meet all manufacturer specifications are all treated collectively as "misfeeds" for benchmarking purposes; i.e., only a single count is maintained.

Source: Merge of [VSS2002] I.3.2.5.1.4.b and I.3.2.5.2.c, reset benchmark

6.3.4 Electromagnetic Compatibility (EMC) immunity

The International Electrotechnical Commission (IEC) Technical Committee 77 on Electromagnetic Compatibility has defined [ISO95a] the concept of “ports” as the interface of an electronic device (“apparatus”) with its electrical and electromagnetic environment, as illustrated in Part 1: Figure 6-1. In the sketch, the arrows point toward the apparatus, but in a complete assessment of the compatibility, one should also consider the other direction – that is, what disturbances (“emissions”) can the apparatus inject into its environment.

Figure 6-1 Electrical and electromagnetic environment

Five of these ports involve conducted disturbances carried by metallic conductors, and the sixth, the “enclosure,” allows radiated disturbances to impinge on the apparatus. In this context, the term “enclosure” should not be understood as limited to a physical entity (metallic, non-metallic, totally enclosed or with openings) but rather as simply the route whereby electromagnetic radiation couples with the circuitry and components of the apparatus.

In previous voting systems guidelines, possible interactions and immunity concerns have been described but perhaps not in explicit terms relating them to the concept of ports. In this updated version of the VVSG, the recitation of compatibility requirements is structured by considering the ports one at a time, plus some consideration of a possible interaction between ports:

    1. Power port – also described as “power supply” – via ordinary receptacles of the polling place
    2. Earth port – implied in the National Electrical Code [NFPA05] stipulations for dealing with the power supply of the polling place
    3. Signal port – connection via the landline telephone of the polling place to the central tabulator
    4. Control port – inter-system connections such as voting station to precinct tabulator
    5. Enclosure port – considerations on immunity to radiated disturbances and electrostatic discharge
    6. Interaction between signal port and power port during surge events

    Note: In this EMC section, the specified voltage and current levels are expressed in root mean square (rms) for power-frequency parameters and in peak value for surges and impulses.

    6.3.4.2 Steady-state conditions

    Adequate operation of any surge-protective device that may be present and, more important, safety considerations demand that the power supply receptacles be of the three-prong type (Line, Neutral, and Equipment Grounding Conductor). The use of a “cheater” adapter for older-type receptacles with only two-blade capacity and no dependable grounding conductor should be prohibited. Details on the safety considerations are addressed in Part 1: 3.2.8.2 “Safety”.

    The requirement of using a dedicated landline telephone service should also be satisfied for polling places.

    Steady-state conditions of a polling place are generally out of the control of the local jurisdiction.

    However, for a polling place to ensure reliable voting, the power supply and telephone service need to be suitable for the purpose. Compliance with the National Electrical Code [NFPA05] is assumed to be required.

    6.3.4.2-A Power supply – energy service provider

    To obtain maximum flexibility of application, the voting system SHALL be powered by a 120 V, single-phase power supply, as available in polling places, derived from typical energy service providers.

    Applies To: Electronic device

    Test Reference: Part 3: 3.1 “Inspection”

    DISCUSSION

    It is assumed that the AC power necessary to operate the voting system will be derived from the existing power distribution system of the facility housing the polling place. This single-phase power may be a leg of a 120/240 V single-phase system, or a leg of a 120/208 V three-phase system, at a frequency of 60 Hz, according to the limits defined in [ANSI06], with premises wiring compliant with [NFPA05], in particular its grounding requirements.

    Source: [NFPA05]

    6.3.4.2-B Telecommunications services provider

    To avoid compromising voting integrity (accidentally or intentionally), the telephone connection of a voting system SHALL use a dedicated line (no extensions on the same telephone number) and be compatible with the requirements of the telephone service provider.

    Applies To: Electronic device

    Test Reference: Part 3: 3.1 “Inspection”

    DISCUSSION

    Communication (upon closing of the polls) between the polling place and the central tabulator is expected to be provided exclusively by the landline network of the telephone service provider connected to the facility housing the polling place. The use of cell phone communications is specifically prohibited.

    Source: New requirement

    6.3.4.3 Conducted disturbances immunity

    As described in the introductory paragraphs of Part 1: 6.3.4 “Electromagnetic Compatibility (EMC) immunity”, several ports of the voting system are gateways to possible electromagnetic disturbances, both inbound and outbound. This section dealing with conducted disturbances immunity addresses concerns about the power port and the communications ports (a combination of the in-house communications and communications to remote tabulating facilities).

    Limitations of outbound conducted disturbances (“emissions” in EMC language) that might inject objectionable interference into the facility power distribution system or the telephone service connection are addressed in Part 1: 6.3.5 “Electromagnetic Compatibility (EMC) emission limits”.

    6.3.4.3-A Power port disturbances

    All electronic voting systems SHALL withstand conducted electrical disturbances that affect the power ports of the system.

    Applies To: Electronic device

    Test Reference: Part 3: 5.1.1.2-A

    DISCUSSION

    The power distribution system of the polling place can be expected to be affected by several types of disturbances, ranging from very brief surges (microseconds) to longer durations (milliseconds) and ultimately the possibility of a long-term outage. These are addressed in the following requirements: A.1, A.2, A.3, and A.4.

    NOTE: There are several scenarios of accidental conditions that can produce voltages far in excess of the deviations implied by [ANSI06] or [ITIC00], such as loss of a neutral conductor or commingling of distribution systems with low-voltage conductors (knocked-down poles, falling tree limbs). Such an event will produce massive failures of equipment in the building other than the voting system, and will be obvious to the officials conducting the polling. Hardware failure of the voting system can be expected. Fortunately, the occurrence of such events is quite rare, albeit not impossible, so such an extreme stress should not be included in the EMC requirements nor in the regimen of national certification testing – provided that the failure mode would not result in a safety hazard.

    Source: [ANSI06], [IEEE02a], [ITIC00]

    6.3.4.3-A.1 Combination Wave

    All electronic voting systems SHALL be able to withstand, without disruption of normal operation or loss of data, a “Combination Wave” surge of 6 kV 1.2/50 µs for high-impedance power ports and 3 kA 8/20 µs for low-impedance power ports, between line and neutral terminals.

    Applies To: Electronic device

    Test Reference: Part 3: 5.1.1.2-A.1

    DISCUSSION

    The so-called “Combination Wave” has been accepted by industry as representative of surges that might occur in low-voltage AC power systems and be imposed on connected loads.

    Source: [IEEE02a]

    6.3.4.3-A.2 Ring Waves

    All electronic voting systems SHALL be able to withstand, without disruption of normal operation or loss of data, a “Ring Wave” surge with a 0.5 µs rise time and a decaying oscillation at 100 kHz with a first peak voltage of 6 kV between the line and neutral terminals, and between the line and equipment grounding conductor terminals, and also 3 kV between the neutral and equipment grounding conductor terminals.

    Applies To: Electronic device

    Test Reference: Part 3: 5.1.1.2-A.2

    DISCUSSION

    This test waveform, proposed by IEEE since 1980 [IEEE80] as a “Standard Waveform” and more recently adopted by the IEC [ISO06c], represents common disturbances on AC power lines, but it was not included in previous versions of the VVSG. It originates during disturbances of power flow within the building, an occurrence more frequent than lightning surges. It is less likely than the Combination Wave to produce hardware destruction, but high levels can still produce hardware failure.

    The “Power Quality” literature [Grebe96] and some standards [IEEE91] also cite “Decaying Ring Waves” or “Damped Oscillatory Waves” with lower frequencies but lesser amplitudes typically associated with the switching of power-factor correction capacitors. These can be significant for surge-protective device survival and possibly disruption of the operation of switched-mode power supplies. However, inclusion of the Combination Wave, the Ring Wave, and the Swells in these immunity criteria should be sufficient to ensure immunity against these lower frequency and lower amplitude decaying ring waves.

    Source: [IEEE02a]

    6.3.4.3-A.3 Electrical Fast Transient Burst

    All electronic voting systems SHALL be able to withstand, without disruption of normal operation or loss of data, a burst of repetitive fast transients with a waveform of 5/50 ns, each burst lasting 15 ms, from a 2 kV source.

    Applies To: Electronic device

    Test Reference: Part 3: 5.1.1.2-A.3

    DISCUSSION

    While the fast transients involved in this immunity requirement do not propagate very far and are not expected to travel from the energy supply provider, they can be induced within a facility if cable runs are exposed to switching disturbances in other load circuits. Unlike the preceding two disturbances that are deemed to represent possibly destructive surges, the Electrical Fast Transient (EFT) Burst has been developed to demonstrate equipment immunity to these non-destructive but disruptive transients. Their repetitive profile increases the probability that a disruption might occur when the logic circuits go through a transition. It is important to recognize that this test, which does not represent the actual environment, is one of interference immunity, not a test of withstanding energy stress.

    Source: [IEEE02a]

    6.3.4.3-A.4 Outages, sags and swells

    All electronic voting systems SHALL be able to withstand, without disruption of normal operation or loss of data, a complete loss of power lasting two hours and also a temporary overvoltage of up to 120 % of nominal system voltage lasting up to 0.5 second, and a permanent overvoltage of up to 110 % of nominal system voltage.

    Applies To: Electronic device

    Test Reference: Part 3: 5.1.1.2-A.4

    DISCUSSION

    Because the VVSG stipulates a two-hour backup, generally implemented by a floating battery pack, sag immunity is inherently ensured. However, the floating battery, unless buffered by a switch-mode power supply with inherent cut-off in case of a large swell, might not ensure inherent immunity against swells (short-duration system overvoltages). The Information Technology industry has adopted a recommendation that IT equipment should be capable of operating correctly during swells reaching 120 % of the nominal system voltage with durations ranging from 3 ms to 0.5 s, and during permanent overvoltages up to 110 % of nominal system voltage.

    Source: [ITIC00]

    6.3.4.3-B Communications (telephone) port disturbances

    All electronic voting systems SHALL withstand conducted electrical disturbances that affect the telephone ports of the system.

    Applies To: Electronic device

    Test Reference: Part 3: 5.1.1.2-B

    DISCUSSION

    Voting equipment, by being connected to the outside service provider via premises wiring, can be exposed to a variety of electromagnetic disturbances. These have been classified as lightning-induced, power-fault induced, power contact, Electrical Fast Transient (EFT), and presence of steady-state induced voltage. Within a complex voting system installed in a polling place, there is also a possibility that the various pieces of equipment can be exposed to emissions from other pieces of connected equipment. In the context of VVSG compatibility, not only must the voting system equipment be immune to these disturbances, but the public switched telephone network must also be protected against harm originating from customer premises equipment, in this context the voting system equipment. Protection of the network is discussed in Part 1: 6.3.5 “Electromagnetic Compatibility (EMC) emission limits”. Immunity to disturbances impinging on the voting system telephone port is addressed in the following requirements: B.1, B.2, B.3, B.4, B.5, and B.6.

    Source: [Telcordia06]

    6.3.4.3-B.1 Emissions from other connected equipment

    All elements of an electronic voting system SHALL be able to withstand the conducted emissions generated by other elements of the voting system.

    Applies To: Electronic device

    Test Reference: Part 3: 5.1.1.2-B.1

    DISCUSSION

    This requirement is an issue of inherent compatibility among the diverse elements of a voting system, not compatibility with the polling place environment or with subscriber equipment other than that making up the voting system. It is understood that security requirements dictate that the voting system's outgoing communications be provided by a dedicated landline telephone service, excluding other subscriber terminal equipment otherwise used by entities occupying the facility, when telephone communication with central tabulators is established.

    Source: [Telcordia06], [ANSI02]

    6.3.4.3-B.2 Lightning-induced disturbances

    All electronic voting systems SHALL be able to withstand, without disruption of normal operation or loss of data, the stresses induced into the telephone network by lightning events, which can propagate to the telephone port of the voting system. The necessary immunity level is 1 kV for high-impedance ports and 100 A for low-impedance ports, both with a 10/1000 µs waveshape.

    Applies To: Electronic device

    Test Reference: Part 3: 5.1.1.2-B.2

    DISCUSSION

    Lightning events (direct flashes to the network or voltages induced in the network by nearby flashes to earth) can give rise to voltage or current surges impinging upon the interface of the premises wiring with the landline network. Surge protection in the Network Interface Device (primary protection at the NID) is not universally provided, especially in dense urban locations; therefore, the immunity level of the telephone port should be demonstrated as required by the Telcordia Generic Requirements.

    Source: [Telcordia06]

    6.3.4.3-B.3 Power fault-induced disturbances

    All electronic voting systems SHALL be able to withstand, without disruption of normal operation or loss of data, the stresses induced into the network by power faults occurring in adjacent power distribution systems. The necessary immunity level is 600 V at 1 A for a 1 s application.

    Applies To: Electronic device

    Test Reference: Part 3: 5.1.1.2-B.3

    DISCUSSION

    For overhead telephone landline cables that share the pole with power distribution cables (medium-voltage as well as low-voltage), as well as direct burial of adjacent telephone and power cables, large power system faults can induce significant voltages and the resulting currents in the telephone network.

    Source: [Telcordia06]

    6.3.4.3-B.4 Power contact disturbances

    All electronic voting systems SHALL be able to withstand, without disruption of normal operation or loss of data, the stresses appearing at the telephone port as a result of an accidental contact between the telephone network cables and nearby power distribution cables. The necessary immunity level between ground and the T/R conductors at 60 Hz is 600 V for short durations and 277 V for indefinite durations.

    Applies To: Electronic device

    Test Reference: Part 3: 5.1.1.2-B.4

    DISCUSSION

    Outside of the polling place building, accidental contact between the telephone network cables and power distribution cables (sharing poles for overhead runs, or sharing trenches for underground runs) can inject substantial 60 Hz currents and voltages into the telephone network. Within the polling place facility, while not highly probable, instances have been noted in which contractors working in a facility can cause a similar injection of 60 Hz current or voltage into the premises telephone wiring. The 600 V level cited in the above requirement is associated with an accidental contact with primary power lines, promptly cleared by the power system protection, while the 277 V level is associated with an accidental contact with a low-voltage distribution system that might not be cleared by the power system protection.

    Source: [Telcordia06]

    6.3.4.3-B.5 Electrical Fast Transient (EFT)

    All electronic voting systems SHALL be able to withstand, without disruption of normal operation or loss of data, the disturbances associated with an EFT burst of 5/50 ns pulses, each burst lasting 15 ms, from a 0.25 kV source.

    Applies To: Electronic device

    Test Reference: Part 3: 5.1.1.2-B.5

    DISCUSSION

    Electrical Fast Transient bursts emulate the interference associated with electromagnetic coupling between the premises wiring of the telephone service and the premises wiring of the power distribution system in which switching surges can occur. Because these switching surges are random events, the occurrence of interference varies with the timing of their occurrence with respect to the transitions of the circuits. It is important to recognize that this requirement deals with interference immunity, not with withstanding energy stress. Immunity against such high-frequency coupling has been added to the requirements listed by [Telcordia06], effective January 1, 2008.

    Source: [Telcordia06], [ISO04b]

    6.3.4.3-B.6 Steady-state induced voltage

    All electronic voting systems SHALL be able to withstand, without disruption of normal operation or loss of data, the disturbances associated with steady-state induced voltages and currents. The necessary immunity level is ≥126 dBrn (50 V).

    Applies To: Electronic device

    Test Reference: Part 3: 5.1.1.2-B.6

    DISCUSSION

    Voting systems interfacing with the telephone service provider plant can be subject to the interfering effects of steady-state voltages induced from nearby power lines. Through electromagnetic coupling, normal operating currents on these power lines can induce common-mode (longitudinal) voltages and currents in the outside cable plant. The 60 Hz and 180 Hz components of the induced voltage spectrum can interfere with signaling and supervisory functions for data transmission from a polling place toward a central tabulator. Higher frequencies can produce audible noise in voice-band transmission.

    Source: [Telcordia06]

    6.3.4.3-C Interaction between power port and telephone port

    All electronic voting systems connected to both a power supply and a landline telephone system SHALL withstand the potential difference caused by the flow of surge current in the facility grounding network.

    Applies To: Electronic device

    Test Reference: Part 3: 5.1.1.2-C

    DISCUSSION

    A voting system that is connected via its power port to the power distribution system of the facility and via its telephone port to the telephone service provider can experience a potentially damaging stress between the two ports during the expected operation of the telephone network interface device in the event of a surge occurring in the telephone system. Because the level of potential differences during a surge event is principally the result of the local configuration of the premises wiring and grounding systems, and thus beyond the control of the local polling entity, inherent immunity of the voting system can be achieved by incorporating a surge reference equalizer that provides the necessary bonding between the input power port and the telephone port during a surge event.

    Source: [IEEE02], [IEEE05]

    6.3.4.4 Radiated disturbances immunity

    This section discusses radiated disturbances impacting the enclosure port of the voting system, including electromagnetic fields originating from adjacent or distant sources, as well as a particular radiation associated with electrostatic discharge.

    Emissions limits requirements of radiated (and conducted) disturbances are addressed in Part 1: 6.3.5.2 “Radiated emissions”.

    6.3.4.4-A Electromagnetic field immunity (80 MHz to 6.0 GHz)

    All electronic voting systems SHALL withstand, without disruption of normal operation or loss of data, exposure to radiated electromagnetic fields of ≥10 V/m over the entire frequency range of 80 MHz to 6.0 GHz, and ≥30 V/m within frequency bands commonly used by portable transmitters.

    Applies To: Electronic device

    Test Reference: Part 3: 5.1.1.3-A

    DISCUSSION

    The proliferation of portable transmitters (cellular telephones and personal communications systems) used by the general population and the common communications transmitters used by security, public safety, amateur radio, and other services increases the likelihood that the voting equipment covered in the VVSG will be exposed to the radiated electromagnetic fields from these devices. Also, other wireless devices (wireless local area networks, etc.), communications and broadcast transmitters may be operating in the vicinity and need to be considered. Since it may be impractical to eliminate nearby radio-frequency sources, voting systems must demonstrate immunity to these signals in order to operate to a high standard of reliability. This requirement is intended to ensure intrinsic immunity to the electromagnetic environment.

    Source: [ANSI97], [ISO06a], [ISO06d]

    6.3.4.4-B Electromagnetic field immunity (150 kHz to 80 MHz)

    All electronic voting systems SHALL withstand, without disruption of normal operation or loss of data, exposure to radio-frequency energy induced on cables in the frequency range of 150 kHz to 80 MHz at a 10 V level.

    Applies To: Electronic device

    Test Reference: Part 3: 5.1.1.3-B

    DISCUSSION

    The dominant coupling mechanism of radiated electromagnetic fields to equipment electronics at frequencies below 80 MHz is considered to be through currents induced on interconnecting cables. At these frequencies, the wavelengths are such that typical circuit components are electrically very small and thus inefficient in coupling energy directly from the radiated electromagnetic fields. The interconnecting cables, on the other hand, tend to be on the order of the signal wavelengths and may act as efficient and possibly resonant antennas. Thus, the radiated electromagnetic fields will efficiently induce currents on these cables that are connected directly to the equipment electronics.

    Source: [ANSI97], [ISO06b]

    6.3.4.4-C Electrostatic discharge immunity

    All electronic voting systems SHALL withstand, without disruption of normal operation or loss of data, electrostatic discharges associated with human contact and contact with mobile equipment (service carts, wheelchairs, etc.).

    Applies To: Electronic device

    Test Reference: Part 3: 5.1.1.3-C

    DISCUSSION

    Electrostatic discharge events can originate from direct contact between an “intruder” (person or object) charged at a potential different from that of the units of the voting system, or from an approaching person about to touch the equipment – an “air discharge.” The resulting discharge current can induce disturbances in the circuits of the equipment.

    Note: The immunity addressed in this section is concerned with normal operations and procedures at the polling place. It does not include immunity to electrostatic discharges that might occur when service personnel open the enclosure and handle internal components.

    Source: [ANSI93], [ISO01]

    6.3.5 Electromagnetic Compatibility (EMC) emission limits

    “Emission limits” are the companion of “Immunity Requirements” – both are necessary to achieve electromagnetic compatibility. In contrast with immunity requirements that are expressed as withstand levels for the equipment, emission limits requirements are expressed as compliance with consensus-derived limits on the parameters of the disturbances injected in the electromagnetic environment by the operation of the voting system.

    6.3.5.1 Conducted emissions

    Electronic voting systems, by their nature, can generate currents or voltages that will exit via their connecting cables to the power supply or to the telephone service provider of the voting facility. To ensure compatibility, industry standards or mandatory regulations have been developed to define maximum levels of such emissions.

    6.3.5.1-A Power port connection to the facility power supply

    All electronic voting systems installed in a polling place SHALL comply with emission limits affecting the power supply connection to the energy service provider according to Federal Regulations [FCC07].

    Applies To: Electronic device

    Test Reference: Part 3: 5.1.2.1 “Conducted emissions limits”

    DISCUSSION

    The normal operation of an electronic system can produce disturbances that will travel upstream and affect the power supply system of the polling place, creating a potential deviation from the expected electromagnetic compatibility of the system. The issue is whether these actual disturbances (after any mitigation means incorporated in the equipment) reach a level significant enough to exceed the stipulated limits, which cover the following categories:

    1. Harmonic emissions associated with the load current drawn by the voting system. However, given the low values of the current drawn by the voting system, these emissions do not represent a significant issue, as explained in [IEEE92]. They are only mentioned here for the sake of completeness in reciting the range of disturbances and therefore do not require testing.
    2. High-frequency conducted emissions (distinct from the harmonic spectrum) into the power cord by coupling from high-frequency switching or data transmission inherent to the system operation. These are addressed in the mandatory certification requirements of [FCC07], Class B.

    Source: [IEEE92], [FCC07]

    6.3.5.1-B Telephone port connection to the public network

    All electronic voting systems installed in a polling place SHALL comply with emission limits stipulated by the industry-recognized organizations of telephone service providers Telcordia [Telcordia06] and TIA [ANSI02].

    Applies To: Electronic device

    Test Reference: Part 3: 5.1.2.1-A

    DISCUSSION

    Regulatory emission limits requirements for protecting the network (public switched telephone network) from harm via customer premises equipment are contained in the source documents [Telcordia06], [ANSI02], and [FCC07a]; compliance with these documents is considered mandatory for offering the equipment on the market.

    Source: [Telcordia06], [ANSI02], [FCC07a]

    6.3.5.1-C Leakage via grounding port

    All electronic voting systems installed in a polling place SHALL comply with limits on leakage currents effectively established by the trip threshold of any listed Ground Fault Circuit Interrupter (GFCI) installed in the branch circuit supplying the voting system.

    Applies To: Electronic device

    Test Reference: Part 3: 5.1.3.2-A

    DISCUSSION

    Excessive leakage current is objectionable for two reasons:

    1. For a branch circuit or wall receptacle that could be provided with a GFCI (depending upon the wiring practice applied at the particular polling place), leakage current above the GFCI built-in trip point would cause the GFCI to trip and therefore disable the operation of the system.
    2. Should the power cord lose the connection to the equipment grounding conductor of the receptacle, a personnel hazard would occur. (Note the prohibition of “cheater” adapters in the discussion of general requirements for the polling place.)

    This requirement is related to safety considerations as discussed in Part 1: 3.2.8.2 “Safety” – in particular the requirement to have the voting system comply with [UL05].

    Note: According to [NFPA05], a bond between the equipment grounding conductor and the neutral conductor is prohibited downstream from the entrance service panel. GFCIs are designed to trip if such a prohibited bond is detected by the GFCI.

    Source: [UL06], [NFPA05]

    6.3.5.2 Radiated emissions

    6.3.5.2-A Radiated radio frequency emissions

    All electronic voting systems installed in a polling place SHALL comply with emission limits according to the Rules and Regulations of the Federal Communications Commission, Part 15, Class B [FCC07] for radiated radio-frequency emissions.

    Applies To: Electronic device

    Test Reference: Part 3: 5.1.2.2-A

    DISCUSSION

    Electronic equipment in general and modern high-speed digital electronic circuits in particular have the potential to produce unintentional radiated and conducted radio-frequency emissions over wide frequency ranges. These unintentional signals can interfere with the normal operation of other equipment, especially radio receivers, in close proximity. The requirements of [FCC07] and [ANSI06a] are intended to minimize this possible interference and control the level of unwanted radio-frequency signals in the environment.

    Source: [FCC07]

    6.3.6 Other requirements

    In addition to the requirements associated with EMC discussed in the preceding sections, there are other requirements, including dielectric withstand, personnel safety considerations (addressed in Part 1: 3.2.8.2 “Safety”) and hardware failure modes (which can also be a safety issue) [UL05].

    6.3.6.1 Dielectric withstand

    6.3.6.1-A Dielectric stresses

    All electronic voting systems SHALL be able to withstand the dielectric test stresses associated with connection to the network, characterized by limits of the admissible leakage current.

    Applies To: Electronic device

    Test Reference: Part 3: 5.1.3.1-A

    DISCUSSION

    Dielectric withstand requirements, stipulated by industry-consensus telephone standards as a condition for connecting equipment to the network, involve the insulation and leakage current limits between elements of the voting system hardware, including the following:

    1. Network and device or accessible circuitry which might in turn connect to the user;
    2. Network and hazardous power system; and
    3. Power equipment.

    Source: [Telcordia06]

    6.4 Workmanship

    This section contains requirements for voting system materials, and for good design and construction workmanship for software and hardware:

    • Software engineering practices;
    • Quality assurance and configuration management;
    • General build quality;
    • Durability;
    • Security and audit architectural requirements;
    • Maintainability;
    • Temperature and humidity; and
    • Equipment transportation and storage.

    6.4.1 Software engineering practices

    This section describes essential design and performance characteristics of the logic used in voting systems. The requirements of this section are intended to ensure that voting system logic is reliable, robust, testable, and maintainable.

    The general requirements of this section apply to logic used to support the entire range of voting system activities. Although this section emphasizes software, the standards described also influence hardware design considerations.

    While there is no best way to design logic, the use of outdated and ad hoc practices is a risk factor for unreliability, unmaintainability, etc. Consequently, these VVSG require the use of modern programming practices. The use of widely recognized and proven logic design methods will facilitate the analysis and testing of voting system logic.

    6.4.1.1 Scope

    The design requirements of this section apply to all application logic, regardless of the ownership of the logic or the ownership and location of the hardware on which the logic is installed or operates. Although it would be desirable for COTS software to conform to the design requirements on workmanship, its conformity to those requirements could not be assessed without access to the source code; hence, the design requirements are scoped to exclude COTS software. However, where there are functional requirements, the behaviors of COTS software and hardware are constrained. (N.B., the definition of COTS precludes any application logic from receiving a COTS designation.)

    Third-party logic, border logic, and configuration data are not required to conform to the design requirements on workmanship, but manufacturers are required to supply that source code and data to the test lab to enable a complete review of the application logic (Requirement Part 2: 3.4.7.2-E, Requirement Part 2: 3.8-D).

    6.4.1.2 Selection of programming languages

    6.4.1.2-A Acceptable programming languages

    Application logic SHALL be produced in a high-level programming language that has all of the following control constructs:

    1. Sequence;
    2. Loop with exit condition (e.g., for, while, and/or do-loops);
    3. If/Then/Else conditional;
    4. Case conditional; and
    5. Block-structured exception handling (e.g., try/throw/catch).

    Applies To: Programmed device

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    The intent of this requirement is clarified in Part 1: 6.4.1.5 “Structured programming” with discussion and examples of specific programming languages.

    By excluding border logic, this requirement allows the use of assembly language for hardware-related segments, such as device controllers and handler programs. It also allows the use of an externally-imposed language for interacting with an Application Program Interface (API) or database query engine. However, the special code should be insulated from the bulk of the code, e.g. by wrapping it in callable units expressed in the prevailing language, to minimize the number of places that special code appears. C.f. [MIRA04] Rule 2.1: "Assembly language shall be encapsulated and isolated."
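
    To illustrate the wrapping approach described above, the following sketch (not taken from these Guidelines; the names pio_write and printBallotLine are hypothetical) shows a C++ callable unit that insulates an externally-imposed C routine so that the rest of the application logic never calls the special code directly.

    #include <stdexcept>
    #include <string>

    // Hypothetical externally-imposed C routine (border logic); this stub stands in
    // for a vendor-supplied device controller so the sketch is self-contained.
    extern "C" int pio_write(const char* data, unsigned length)
    {
        return (data != 0 && length > 0) ? 0 : 1;   // 0 = success, nonzero = failure
    }

    // Wrapper callable unit in the prevailing language: the rest of the application
    // logic calls only printBallotLine(), so the special code stays in one place and
    // its error codes are translated into exceptions.
    void printBallotLine(const std::string& line)
    {
        if (pio_write(line.c_str(), static_cast<unsigned>(line.size())) != 0) {
            throw std::runtime_error("printBallotLine: device write failed");
        }
    }

    int main()
    {
        printBallotLine("Precinct 12 summary");
        return 0;
    }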

    Acceptable programming languages are also constrained by Requirement Part 1: 6.4.1.7-A.3 and Requirement Part 1: 6.4.1.7-A.4, which effectively prohibit the invention of new languages.

    Source: [VVSG2005] I.5.2.1, I.5.2.4 and II.5.4.1

    6.4.1.2-A.1 COTS language extensions are acceptable

    Requirement Part 1: 6.4.1.2-A MAY be satisfied by using COTS extension packages to add missing control constructs to languages that could not otherwise conform.

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    For example, C99 [ISO99] does not support block-structured exception handling, but the construct can be retrofitted using (e.g.) [Sourceforge00] or another COTS package.

    The use of non-COTS extension packages or manufacturer-specific code for this purpose is not acceptable, as it would place an unreasonable burden on the test lab to verify the soundness of an unproven extension (effectively a new programming language). The package must have a proven track record of performance supporting the assertion that it would be stable and suitable for use in voting systems, just as the compiler or interpreter for the base programming language must.

    Source: Tightening of [VVSG2005] I.5.2.4 and II.5.4.1

    6.4.1.3 Selection of general coding conventions

    6.4.1.3-A Acceptable coding conventions

    Application logic SHALL adhere to a published, credible set of coding rules, conventions or standards (herein simply called "coding conventions") that enhance the workmanship, security, integrity, testability, and maintainability of applications.

    Applies To: Programmed device

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    Coding conventions that are excessively specialized or simply inadequate may be rejected on the grounds that they do not enhance one or more of workmanship, security, integrity, testability, and maintainability.

    See the discussion for Requirement Part 1: 6.4.1.2-A regarding border logic.

    Source: Rewrite of [VSS2002] I.4.2.6

    6.4.1.3-A.1 Published

    Coding conventions SHALL be considered published if and only if they appear in a publicly available book, magazine, journal, or new media with analogous circulation and availability, or if they are publicly available on the Internet.

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    This requirement attempts to clarify the "published, reviewed, and industry-accepted" language appearing in previous iterations of the VVSG, but the intent of the requirement is unchanged.

    Following are examples of published coding conventions (links valid as of 2007-02). These are only examples and are not necessarily the best available for the purpose.

    1. Ada: Christine Ausnit-Hood, Kent A. Johnson, Robert G. Pettit, IV, and Steven B. Opdahl, Eds., Ada 95 Quality and Style, Lecture Notes in Computer Science #1344, Springer-Verlag, 1995-06. Content available at http://www.iste.uni-stuttgart.de/ps/ada-doc/style_guide/cover.html and elsewhere.
    2. C++: Mats Henricson and Erik Nyquist, Industrial Strength C++, Prentice-Hall, 1997. Content available at http://hem.passagen.se/erinyq/industrial/.
    3. C#: "Design Guidelines for Class Library Developers," Microsoft. http://www.msdn.microsoft.com/library/default.asp?url=/library/en-us/cpgenref/html/cpconnetframeworkdesignguidelines.asp.
    4. Java: "Code Conventions for the Java™ Programming Language," Sun Microsystems. http://java.sun.com/docs/codeconv/.

    Source: Clarification of [VSS2002] I.4.2.6

    6.4.1.3-A.2 Credible

    Coding conventions SHALL be considered credible if and only if at least two different organizations with no ties to the creator of the rules or to the manufacturer seeking conformity assessment, and which are not themselves voting equipment manufacturers, independently decided to adopt them and made active use of them at some time within the three years before conformity assessment was first sought.

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    This requirement attempts to clarify the "published, reviewed, and industry-accepted" language appearing in previous iterations of the VVSG, but the intent of the requirement is unchanged.

    Coding conventions evolve, and it is desirable for voting systems to be aligned with modern practices. If the "three year rule" was satisfied at the time that a system was first submitted for testing, it is considered satisfied for the purpose of subsequent reassessments of that system. However, new systems must meet the three year rule as of the time that they are first submitted for testing, even if they reuse parts of older systems.

    Source: Clarification of [VSS2002] I.4.2.6

    6.4.1.4 Software modularity and programming

    6.4.1.4-A Modularity

    Application logic SHALL be designed in a modular fashion.

    Applies To: Programmed device

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    See module. The modularity rules described here apply to the component submodules of a library.

    Source: Extracted and revised from [VSS2002] I.4.2.3

    6.4.1.4-A.1 Module testability

    Each module SHALL have a specific function that can be tested and verified independently of the remainder of the code.

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    In practice, some additional modules (such as library modules) may be needed to compile the module under test, but the modular construction allows the supporting modules to be replaced by special test versions that support test objectives.
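
    The following is a minimal C++ sketch of that substitution, with illustrative names (BallotStore, castVote, RecordingStore): the module under test depends only on an interface, so a special test version of the supporting module can stand in for the production one.

    #include <cassert>

    // Supporting-module interface: the module under test depends only on this.
    class BallotStore {
    public:
        virtual ~BallotStore() {}
        virtual void record(int contestId, int choiceId) = 0;
    };

    // Module under test: a single function whose behavior can be verified in isolation.
    void castVote(BallotStore& store, int contestId, int choiceId)
    {
        store.record(contestId, choiceId);
    }

    // Special test version of the supporting module, used only to meet test objectives.
    class RecordingStore : public BallotStore {
    public:
        RecordingStore() : calls(0), lastContest(-1), lastChoice(-1) {}
        virtual void record(int contestId, int choiceId)
        {
            ++calls;
            lastContest = contestId;
            lastChoice = choiceId;
        }
        int calls, lastContest, lastChoice;
    };

    int main()
    {
        RecordingStore testStore;
        castVote(testStore, 7, 3);
        assert(testStore.calls == 1 && testStore.lastContest == 7 && testStore.lastChoice == 3);
        return 0;
    }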

    Source: Extracted and revised from [VSS2002] I.4.2.3.a

    6.4.1.4-B Module size and identification

    Modules SHALL be small and easily identifiable.

    Test Reference: Part 3: 4.5.1 “Workmanship”

    Source: Revision of [VSS2002] II.5.4.2.i, as revised by Section 6.6.4.2, Paragraph i of [P1583] and subsequent issues[5]

    6.4.1.4-B.1 Callable unit length limit

    No more than 50 % of all callable units (functions, methods, operations, subroutines, procedures, etc.) SHOULD exceed 25 lines of code in length, excluding comments, blank lines, and initializers for read-only lookup tables; no more than 5 % of all callable units SHOULD exceed 60 lines in length; and no callable units SHOULD exceed 180 lines in length.

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    "Lines," in this context, are defined as executable statements or flow control statements with suitable formatting.

    Source: Revision of [VSS2002] II.5.4.2.i, as revised by Section 6.6.4.2, Paragraph i of [P1583][5]

    6.4.1.4-B.2 Lookup tables in separate files

    Read-only lookup tables longer than 25 lines SHOULD be placed in separate files from other source code if the programming language permits it.

    Test Reference: Part 3: 4.5.1 “Workmanship”

    6.4.1.5 Structured programming

    Note: Specific programming languages are identified to support the discussion. In no case does such identification imply recommendation or endorsement, nor does it imply that the programming languages identified are necessarily the best or only languages acceptable for voting system use.

    Table 6-4 Presence of high-level concepts of control flow in the coding conventions of earlier versions of VVSG and in various programming languages

    Column key: VSS/VVSG = VSS [GPO90], [VSS2002] / VVSG [VVSG2005]; Ada = Ada [ISO87], [ISO95]; C = C [ISO90], [ISO99]; C++ = C++ [ISO98], [ISO03a]; C# = C# [ISO03b], [ISO06]; Java = Java [java05]; VB 8 = Visual Basic 8 [MS05].

    Concept                             | VSS/VVSG | Ada | C   | C++ | C#  | Java | VB 8
    Sequence                            | Yes      | Yes | Yes | Yes | Yes | Yes  | Yes
    Loop with exit condition            | Yes      | Yes | Yes | Yes | Yes | Yes  | Yes
    If/Then/Else conditional            | Yes      | Yes | Yes | Yes | Yes | Yes  | Yes
    Case conditional                    | Yes      | Yes | Yes | Yes | Yes | Yes  | Yes
    Named block exit                    | No       | Yes | No  | No  | No  | Yes  | No[1]
    Block-structured exception handling | No       | Yes | No  | Yes | Yes | Yes  | Yes

    The requirement to follow coding conventions serves two purposes. First, by requiring specific risk factors to be mitigated, coding conventions support integrity and maintainability of voting system logic. Second, by making the logic more transparent to a reviewer, coding conventions facilitate test lab evaluation of the logic's correctness to a level of assurance beyond that provided by operational testing.

    Prominent among the requirements addressing logical transparency is the requirement to use high-level control constructs and to refrain from using the low-level arbitrary branch (a.k.a. goto). As is reflected in Part 1: Table 6-4, most high-level concepts for control flow were established by the time the first edition of the Guidelines was published and are supported by all of the programming languages that were examined as probable candidates for voting system use as of this iteration. However, two additional concepts have been slower to gain universal support.

    The first additional concept, called here the "named block exit," is the ability to exit a specific block from within an arbitrary number of nested blocks, as opposed to only being able to exit the innermost block, without resorting to goto. The absence of named block exit from some languages is not cause for concern here because deeply nested blocks are themselves detrimental to the transparency of logic and most coding conventions encourage restructuring them into separate callable units.
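
    As a rough illustration of that restructuring, the following C++ sketch (illustrative names) factors a nested search into its own callable unit, so an ordinary return replaces a multi-level block exit.

    #include <cassert>
    #include <cstddef>
    #include <vector>

    // Instead of exiting a doubly nested loop with a labeled break (not available in
    // C++), the search is factored into its own callable unit; an ordinary return
    // exits all levels at once and the caller stays flat.
    bool containsValue(const std::vector<std::vector<int> >& grid, int wanted)
    {
        for (std::size_t row = 0; row < grid.size(); ++row) {
            for (std::size_t col = 0; col < grid[row].size(); ++col) {
                if (grid[row][col] == wanted) {
                    return true;   // exits both loops and the unit
                }
            }
        }
        return false;
    }

    int main()
    {
        std::vector<std::vector<int> > grid(2, std::vector<int>(3, 0));
        grid[1][2] = 42;
        assert(containsValue(grid, 42) && !containsValue(grid, 7));
        return 0;
    }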

    The second additional concept, called here "block-structured exception handling," is the ability to associate exception handlers with blocks of logic, and implicitly, the presence of the exception concept in the programming language. (This simply means try/throw/catch or equivalent statements, and should not be confused with the specific implementation known as Structured Exception Handling (SEH) [Pietrek97].[2]) Unlike deeply nested blocks, exceptions cannot be eliminated by restructuring logic. "When exceptions are not used, the errors cannot be handled but their existence is not avoided." [ISO00a]

    Previous versions of VVSG required voting systems to handle such errors by some means, preferably using programming language exceptions ([VVSG2005] I.5.2.3.e), but there was no unambiguous requirement for the programming language to support exception handling. These Guidelines require programming language exceptions because without them, the programmer must check for every possible error condition in every possible location, which both obfuscates the application logic and creates a high likelihood that some or many possible errors will not be checked. Additionally, these Guidelines require block-structured exception handling because, like all unstructured programming, unstructured exception handling obfuscates logic and makes its verification by the test lab more difficult. "One of the major difficulties of conventional defensive programming is that the fault tolerance actions are inseparably bound in with the normal processing which the design is to provide. This can significantly increase design complexity and, consequently, can compromise the reliability and maintainability of the software." [Moulding89]
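
    A minimal sketch of the difference, in C++ with illustrative names, appears below: each step throws on failure rather than returning a status code, and the fault-handling action is kept in one place instead of being interleaved with the normal processing.

    #include <iostream>
    #include <stdexcept>
    #include <string>

    // Each step simply throws on failure; it does not return a status code that every
    // caller must remember to check.
    void loadBallotStyle(const std::string& name)
    {
        if (name.empty()) {
            throw std::runtime_error("ballot style name is empty");
        }
        // ... normal processing ...
    }

    int main()
    {
        try {
            // Normal processing reads straight through, uncluttered by error checks.
            loadBallotStyle("precinct-12");
            loadBallotStyle("");             // triggers the exception path
        } catch (const std::exception& e) {
            // The fault-tolerance action is kept in one place, separate from normal flow.
            std::cerr << "error: " << e.what() << std::endl;
            return 1;
        }
        return 0;
    }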

    Existing voting system logic implemented in programming languages that do not support block-structured exception handling can be brought into compliance either through migration to a newer programming language (most likely, a descendant of the same language that would require minimal changes) or through the use of a COTS package that retrofits block-structured exception handling onto the previous language with minimal changes. While the latter path may at first appear to be less work, it should be noted that many library functions may need to be adapted to throw exceptions when exceptional conditions arise, whereas in a programming environment that had exceptions to begin with the analogous library functions would already do this (see Requirement Part 1: 6.4.1.5-A.1).

    6.4.1.5-A Block-structured exception handling

    Application logic SHALL handle exceptions using block-structured exception handling constructs.

    Applies To: Programmed device

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    See Part 1: 6.4.1.5 “Structured programming”.

    Source: Extension of [VVSG2005] requirements for structured programming

    6.4.1.5-A.1 Legacy library units must be wrapped

    If application logic makes use of any COTS or third-party logic callable units that do not throw exceptions when exceptional conditions occur, those callable units SHALL be wrapped in callable units that check for the relevant error conditions and translate them into exceptions, and the remainder of application logic SHALL use only the wrapped version.

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    For example, if an application written in C99 [ISO99] + cexcept [Sourceforge00] used the malloc function of libc, which returns a null pointer in case of failure instead of throwing an exception, the malloc function would need to be wrapped. Here is one possible implementation:

    /* Assumes cexcept [Sourceforge00] is set up elsewhere: Throw is its macro and
       bad_alloc is a value of the application's exception type. */
    #include <stdlib.h>

    void *checkedMalloc (size_t size) {
    	void *ptr = malloc (size);	/* still the real malloc at this point */
    	if (!ptr)
    		Throw bad_alloc;	/* translate the failure into an exception */
    	return ptr;
    }
    /* The #define must follow the wrapper so the call above is not rewritten. */
    #define malloc checkedMalloc

    Wrapping legacy functions avoids the need to check for errors after every invocation, which both obfuscates the application logic and creates a high likelihood that some or many possible errors will not be checked for. In C++, it would be preferable to use one of the newer mechanisms that already throw exceptions on failure and avoid use of legacy functions altogether.
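
    For instance, the following C++ sketch relies on mechanisms that already throw std::bad_alloc on allocation failure, so no checkedMalloc-style wrapper is needed; it is an illustration, not a mandated approach.

    #include <iostream>
    #include <new>
    #include <vector>

    int main()
    {
        try {
            // operator new and std::vector already throw std::bad_alloc on failure,
            // so the application needs no wrapper around a legacy allocator.
            std::vector<char> buffer(64 * 1024);
            int* counters = new int[1024];
            delete [] counters;
            (void)buffer;
        } catch (const std::bad_alloc&) {
            std::cerr << "allocation failed" << std::endl;
            return 1;
        }
        return 0;
    }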

    Source: New requirement

    6.4.1.5-B Unstructured control flow is prohibited

    Application logic SHALL contain no unstructured control constructs.

    Applies To: Programmed device

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    See the discussion for Requirement Part 1: 6.4.1.2-A regarding border logic.

    Source: Generalization and summary of [VVSG2005] I.5.2.4 and II.5.4.1

    6.4.1.5-B.1 No arbitrary branches

    Arbitrary branches (a.k.a. gotos) are prohibited.

    Test Reference: Part 3: 4.5.1 “Workmanship”

    Source: Generalization and summary of [VVSG2005] I.5.2.4 and II.5.4.1

    6.4.1.5-B.2 Intentional exceptions

    Exceptions SHALL only be used for abnormal conditions. Exceptions SHALL NOT be used to redirect the flow of control in normal ("non-exceptional") conditions.

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    "Intentional exceptions" cannot be used as a substitute for arbitrary branch. Normal, expected events, such as reaching the end of a file that is being read from beginning to end or receiving invalid input from a user interface, are not exceptional conditions and should not be implemented using exception handlers.

    Source: [VSS2002] I.4.2.4.d, II.5.4.1.c / [VVSG2005] I.5.2.4.a.iii, II.5.4.1

    6.4.1.5-B.3 Unstructured exception handling

    Unstructured exception handling (e.g., On Error GoTo, setjmp/longjmp, or explicit tests for error conditions after every executable statement) is prohibited.

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    The internal use of such constructs by a COTS extension package that adds block-structured exception handling to a programming language that otherwise would not have it, as described in Requirement Part 1: 6.4.1.2-A.1, is allowed. Analogously, it is not a problem that source code written in a high-level programming language is compiled into low-level machine code that contains arbitrary branches. It is only the direct use of low-level constructs in application logic that presents a problem.

    Source: Extension of [VVSG2005] requirements for structured programming

    6.4.1.5-C Separation of code and data

    Application logic SHALL NOT compile or interpret configuration data or other input data as a programming language.

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    The requirement in [VVSG2005] read "Operator intervention or logic that evaluates received or stored data shall not re-direct program control within a program routine." That attempt to define what it means to compile or interpret data as a programming language caused confusion.

    Distinguishing what is a programming language from what is not requires some professional judgment. However, in general, sequential execution of imperative instructions is a characteristic of conventional programming languages that should not be exhibited by configuration data. Configuration data must be declarative or informative in nature, not imperative.

    For example: it is permissible for configuration data to contain a template that informs a report generating application as to the form and content of a report that it should generate, but it is not permissible for configuration data to contain instructions that are executed or interpreted to generate a report, essentially embedding the logic of the report generator inside the configuration data.
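
    The following C++ sketch (the template format and names are purely illustrative) shows the permissible arrangement: the configuration carries a declarative template describing the report's form, and all imperative logic stays in the application.

    #include <iostream>
    #include <map>
    #include <string>

    // Declarative configuration data: it states what the report should contain
    // (a title and placeholder names), not instructions for producing it.
    const std::string kReportTemplate =
        "Precinct summary: {precinct}\nBallots cast: {ballots}\n";

    // All imperative logic lives in application logic, not in the configuration.
    std::string renderReport(std::string text, const std::map<std::string, std::string>& values)
    {
        for (std::map<std::string, std::string>::const_iterator it = values.begin();
             it != values.end(); ++it) {
            const std::string placeholder = "{" + it->first + "}";
            std::string::size_type pos;
            while ((pos = text.find(placeholder)) != std::string::npos) {
                text.replace(pos, placeholder.size(), it->second);
            }
        }
        return text;
    }

    int main()
    {
        std::map<std::string, std::string> values;
        values["precinct"] = "12";
        values["ballots"] = "1403";
        std::cout << renderReport(kReportTemplate, values);
        return 0;
    }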

    The reasons for this requirement are (1) mingling code and data is bad design, and (2) embedding logic within configuration data is an evasion of the conformity assessment process for application logic.

    See also Requirement Part 1: 6.4.1.7-A.3 and Requirement Part 1: 6.4.1.7-A.4.

    Source: Clarification of [VSS2002] I.4.2.4.d and II.5.4.1.c / [VVSG2005] I.5.2.4.a.iii and II.5.4.1 paragraph 4

    6.4.1.6 Comments

    6.4.1.6-A Header Comments

    Application logic modules SHOULD include header comments that provide at least the following information for each callable unit (function, method, operation, subroutine, procedure, etc.):

    1. The purpose of the unit and how it works (if not obvious);
    2. A description of input parameters, outputs and return values, exceptions thrown, and side-effects;
    3. Any protocols that must be observed (e.g., unit calling sequences);
    4. File references by name and method of access (read, write, modify, append, etc.);
    5. Global variables used (if applicable);
    6. Audit event generation;
    7. Date of creation; and
    8. Change log (revision record).

    Applies To: Programmed device

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    Header comments and other commenting conventions should be specified by the selected coding conventions in a manner consistent with the idiom of the programming language chosen. If the coding conventions specify a coding style and commenting convention that make header comments redundant, then they may be omitted. Otherwise, in the event that the coding conventions fail to specify the content of header comments, the non-redundant portions of this generic guideline should be applied.

    Change logs need not cover the nascent period, but they must go back as far as the first baseline or release that is submitted for testing, and should go back as far as the first baseline or release that is deemed reasonably coherent.
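
    The following is one illustrative form such a header comment could take (the routine, dates, and conventions shown are hypothetical); the actual format should follow the selected coding conventions.

    /**
     * Purpose:      Records one contest selection for the current voter session.
     * Inputs:       contestId - identifier of the contest being voted.
     *               choiceId  - identifier of the selected choice.
     * Returns:      void.
     * Throws:       std::out_of_range if either identifier is outside the ballot style.
     * Side effects: Appends the selection to the in-memory cast vote record.
     * Protocols:    Must be called between beginSession() and endSession().
     * Files:        None accessed directly.
     * Globals:      None.
     * Audit:        Generates a "selection recorded" audit event.
     * Created:      2007-03-01
     * Change log:   2007-04-02  range check added on choiceId.
     */
    void recordSelection(int contestId, int choiceId);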

    Source: Revised from [VSS2002] I.4.2.7.a

    6.4.1.7 Executable code and data integrity

    Portions of this section are from or derived from [P1583], as noted in requirements and discussion text[3],[4].

    6.4.1.7-A Code coherency

    Application logic SHALL conform to the following subrequirements.

    Applies To: Programmed device

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    This scopes the following subrequirements to application logic. For COTS software, where source code is unobtainable, they would be unverifiable.

    6.4.1.7-A.1 Self-modifying code

    Self-modifying code is prohibited.

    Test Reference: Part 3: 4.5.1 “Workmanship”

    Source: [VSS2002] I.4.2.2

    6.4.1.7-A.2 Unsafe concurrency

    Application logic SHALL be free of race conditions, deadlocks, livelocks, and resource starvation.

    Test Reference: Part 3: 3.1 “Inspection”, 3.2 “Functional Testing”

    Source: New requirement

    6.4.1.7-A.3 Code integrity, no strange compilers

    If compiled code is used, it SHALL only be compiled using a COTS compiler.

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    This prohibits the use of arbitrary, nonstandard compilers and consequently the invention of new programming languages.

    Source: New requirement

    6.4.1.7-A.4 Interpreted code, specific COTS interpreter

    If interpreted code is used, it SHALL only be run under a specific, identified version of a COTS runtime interpreter.

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    This ensures that (1) no arbitrary, nonstandard interpreted languages are used, and (2) the software tested and approved during the conformity assessment process does not change behavior because of a change to the interpreter.

    Source: [P1583] Section 5.6.2.2

    6.4.1.7-B Prevent tampering with code

    Programmed devices SHALL prevent replacement or modification of executable or interpreted code (e.g., by other programs on the system, by people physically replacing the memory or medium containing the code, or by faulty code) except where this access is necessary to conduct the voting process.

    Applies To: Programmed device

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    This requirement may be partially satisfied through a combination of read-only memory (ROM), the memory protection implemented by most popular COTS operating systems, error checking as described in Part 1: 6.4.1.8 “Error checking”, and access and integrity controls.

    Source: Rewording/expansion of [VSS2002] I.4.2.2

    6.4.1.7-C Prevent tampering with data

    All voting devices SHALL prevent access to or manipulation of configuration data, vote data, or audit records (e.g., by physical tampering with the medium or mechanism containing the data, by other programs on the system, or by faulty code) except where this access is necessary to conduct the voting process.

    Applies To: Voting device

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    This requirement may be partially satisfied through a combination of the memory protection implemented by most popular COTS operating systems, error checking as described in Part 1: 6.4.1.8 “Error checking”, and access and integrity controls. Systems using mechanical counters to store vote data must protect the counters from tampering. If vote data are stored on paper, the paper must be protected from tampering. Modification of audit records after they are created is never necessary.

    Source: Rewording/expansion of [VSS2002] I.4.2.2

    6.4.1.7-D Monitor I/O errors

    Programmed devices SHALL provide the capability to monitor the transfer quality of I/O operations, reporting the number and types of errors that occur and how they were corrected.

    Applies To: Programmed device

    Test Reference: Part 3: 4.5.1 “Workmanship”

    Source: [VSS2002] I.2.2.2.1.e

    6.4.1.8 Error checking

    This section contains requirements for application logic to avoid, detect, and prevent well-known types of errors that could compromise voting integrity and security[5],[6]. Additional advice from the security perspective is available at [CERT06] and related sites, esp. [DHS06].

    6.4.1.8-A Detect garbage input

    Programmed devices SHALL check information inputs for completeness and validity.

    Applies To: Programmed device

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    This general requirement applies to all programmed devices, while the specific ones following are only enforceable for application logic.

    Source: [NIST05] [S-I-10]

    6.4.1.8-A.1 Defend against garbage input

    Programmed devices SHALL ensure that incomplete or invalid inputs do not lead to irreversible error.

    Test Reference: Part 3: 4.5.1 “Workmanship”

    Source: [VSS2002] I.2.2.5.2.2.f

    6.4.1.8-B Mandatory internal error checking

    Application logic that is vulnerable to the following types of errors SHALL check for these errors at run time and respond defensively (as specified by Requirement Part 1: 6.4.1.8-F) when they occur:

    1. Out-of-bounds accesses of arrays or strings (includes buffers used to move data);
    2. Stack overflow errors;
    3. CPU-level exceptions such as address and bus errors, dividing by zero, and the like;
    4. Variables that are not appropriately handled when out of expected boundaries;
    5. Numeric overflows; or
    6. Known programming language specific vulnerabilities.

    Applies To: Programmed device

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    It is acceptable, even expected, that logic verification will show that some error checks cannot logically be triggered and some exception handlers cannot logically be invoked. These checks and exception handlers are not redundant – they provide defense-in-depth against faults that escape detection during logic verification.

    See also Requirement Part 1: 7.5.6-A.

    Source: [P1583] Section 5.6.2.2 expansion of [VSS2002] I.4.2.2, modified

    6.4.1.8-B.1 Array overflows

    If the application logic uses arrays, vectors, or any analogous data structures and the programming language does not provide automatic run-time range checking of the indices, the indices SHALL be ranged-checked on every access.

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    Range checking code should not be duplicated before each access. Clean implementation approaches include:

    1. Consistently using dedicated accessors (functions, methods, operations, subroutines, procedures, etc.) that range-check the indices;
    2. Defining and consistently using a new data type or class that encapsulates the range-checking logic;
    3. Declaring the array using a template that causes all accessors to be range-checked; or
    4. Declaring the array index to be a data type whose enforced range is matched to the size of the array.

    Range-enforced data types or classes may be provided by the programming environment or they may be defined in application logic.

    If acceptable values of the index do not form a contiguous range, a map structure may be more appropriate than a vector.
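
    As a small illustration of approaches 1 and 2 in the list above, in C++ the standard std::vector::at() accessor already range-checks every access and throws std::out_of_range on violation.

    #include <iostream>
    #include <stdexcept>
    #include <vector>

    int main()
    {
        std::vector<int> voteCounts(4, 0);
        try {
            voteCounts.at(2) += 1;    // at() range-checks the index on every access
            voteCounts.at(9) += 1;    // out of range: throws instead of corrupting memory
        } catch (const std::out_of_range& e) {
            std::cerr << "index error: " << e.what() << std::endl;
            return 1;
        }
        return 0;
    }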

    Source: Expansion of [VSS2002] I.4.2.2

    6.4.1.8-B.2 Stack overflows

    If stack overflow does not automatically result in an exception, the application logic SHALL explicitly check for and prevent stack overflow.

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    Embedded system developers use a variety of techniques for avoiding stack overflow. Commonly, the stack is monitored and warnings and exceptions are thrown when thresholds are crossed. In non-embedded contexts, stack overflow often manifests as a CPU-level exception related to memory segmentation, in which case it can be handled pursuant to Requirement Part 1: 6.4.1.8-B.3 and Requirement Part 1: 6.4.1.9-D.2.

    Source: Added precision

    6.4.1.8-B.3 CPU traps

    The application logic SHALL implement such handlers as are needed to detect and respond to CPU-level exceptions.

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    For example, under Unix a CPU-level exception would manifest as a signal, so a signal handler is needed. If the platform supports it, it is preferable to translate CPU-level exceptions into software-level exceptions so that all exceptions can be handled in a consistent fashion within the voting application; however, not all platforms support it.

    Source: Added precision

    6.4.1.8-B.4 Garbage input parameters

    All scalar or enumerated type parameters whose valid ranges as used in a callable unit (function, method, operation, subroutine, procedure, etc.) do not cover the entire ranges of their declared data types SHALL be range-checked on entry to the unit.

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    This applies to parameters of numeric types, character types, temporal types, and any other types for which the concept of range is well-defined.[7] In cases where the restricted range is frequently used and/or associated with a meaningful concept within the scope of the application, the best approach is to define a new class or data type that encapsulates the range restriction, eliminating the need for range checks on each use.

    This requirement differs from Requirement Part 1: 6.4.1.8-A: that requirement deals with user input, which is expected to contain errors, whereas this requirement deals with program-internal parameters, which are expected to conform to the designer's expectations. User input errors are a normal occurrence; the errors discussed here are grounds for throwing exceptions.
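
    A minimal C++ sketch of the "new class or data type" approach, with illustrative names: the constructor performs the range check once, so callable units that accept the type need no per-call checks.

    #include <stdexcept>

    // Encapsulates the range restriction once: a Percentage is always 0..100.
    class Percentage {
    public:
        explicit Percentage(int value) : value_(value)
        {
            if (value < 0 || value > 100) {
                throw std::out_of_range("Percentage outside 0..100");
            }
        }
        int value() const { return value_; }
    private:
        int value_;
    };

    // Callable units that take a Percentage need no further range checks on entry.
    int ballotsNeeded(int totalBallots, Percentage turnout)
    {
        return totalBallots * turnout.value() / 100;
    }

    int main()
    {
        return ballotsNeeded(1000, Percentage(85)) == 850 ? 0 : 1;
    }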

    Source: Elaboration on Requirement Part 1: 6.4.1.8-B.d, which is an expansion of [VSS2002] I.4.2.2

    6.4.1.8-B.5 Numeric overflows

    If the programming language does not provide automatic run-time detection of numeric overflow, all arithmetic operations that could potentially overflow the relevant data type SHALL be checked for overflow.

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    This requirement should be approached in a manner similar to Requirement Part 1: 6.4.1.8-B.1. Overflow checking should be encapsulated as much as possible.
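
    One way to encapsulate the check, shown here as an illustrative C++ sketch rather than a required pattern, is a helper that detects signed overflow before performing the addition and throws.

    #include <limits>
    #include <stdexcept>

    // Encapsulated overflow check: callers use checkedAdd() instead of repeating the
    // test before every addition.
    int checkedAdd(int a, int b)
    {
        if ((b > 0 && a > std::numeric_limits<int>::max() - b) ||
            (b < 0 && a < std::numeric_limits<int>::min() - b)) {
            throw std::overflow_error("integer addition overflow");
        }
        return a + b;
    }

    int main()
    {
        int total = checkedAdd(1000, 2345);   // normal case: returns 3345
        try {
            total = checkedAdd(std::numeric_limits<int>::max(), 1);   // detected and thrown
        } catch (const std::overflow_error&) {
            // respond defensively per Requirement Part 1: 6.4.1.8-F
        }
        return total == 3345 ? 0 : 1;
    }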

    Source: Added precision

    6.4.1.8-C Recommended internal error checking

    Application logic that is vulnerable to the following types of errors SHOULD check for these errors at run time and respond defensively (as specified by Requirement Part 1: 6.4.1.8-F) when they occur:

    1. Pointer variable errors; and
    2. Dynamic memory allocation and management errors.

    Applies To: Programmed device

    Test Reference: Part 3: 4.5.1 “Workmanship”

    Source: [P1583] Section 5.6.2.2 expansion of [VSS2002] I.4.2.2, modified

    6.4.1.8-C.1 Pointers

    If application logic uses pointers or a similar mechanism for specifying absolute memory locations, the application logic SHOULD validate pointers or addresses before they are used.

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    Improper overwriting should be prevented in general as required by Requirement Part 1: 6.4.1.7-B and Requirement Part 1: 6.4.1.7-C. Nevertheless, even if read-only memory would prevent the overwrite from succeeding, an attempted overwrite indicates a logic fault that must be corrected.

    Pointer use that is fully encapsulated within a standard platform library is treated as COTS software.

    Source: Slight revision of [P1583] 6.6.4.2.e

    6.4.1.8-D Memory mismanagement

    If dynamic memory allocation is performed in application logic, the application logic SHOULD be instrumented and/or analyzed with a COTS tool for detecting memory management errors.

    Applies To: Programmed device

    Test Reference: Part 3: 4.4 “Manufacturer Practices for Quality Assurance and Configuration Management”

    DISCUSSION

    Dynamic memory allocation that is fully encapsulated within a standard platform library is treated as COTS software. This is "should" rather than "shall" only because such tooling may not be available or applicable in all cases; see the discussion of supported platforms and barriers to portability in [Valgrind07].

    6.4.1.8-E Nullify freed pointers

    If pointers are used, any pointer variables that remain within scope after the memory they point to is deallocated SHALL be set to null or marked as invalid (pursuant to the idiom of the programming language used) after the memory they point to is deallocated.

    Applies To: Programmed device

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    If this is not done automatically by the programming environment, a callable unit should be dedicated to the task of deallocating memory and nullifying pointers. Equivalently, "smart pointers" like the C++ std::auto_ptr can be used to avoid the problem. One should not add assignments after every deallocation in the source code.

    In languages using garbage collection, memory is not deallocated until all pointers to it have gone out of scope, so this requirement is moot.
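
    The following C++ sketch (illustrative names) shows the dedicated-callable-unit approach described above; a smart pointer such as std::auto_ptr achieves the same end without an explicit helper.

    #include <cassert>
    #include <cstddef>

    // Dedicated callable unit: deallocates the memory and nullifies the pointer in
    // one place, so no bare "delete p;" is left followed by a dangling pointer.
    template <typename T>
    void destroyAndNull(T*& p)
    {
        delete p;
        p = NULL;
    }

    int main()
    {
        int* tally = new int(0);
        destroyAndNull(tally);
        assert(tally == NULL);   // the variable can no longer be used as a dangling pointer
        return 0;
    }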

    Source: New requirement

    6.4.1.8-F React to errors detected

    The detection of any of the errors enumerated in Requirement Part 1: 6.4.1.8-B and Requirement Part 1: 6.4.1.8-C SHALL be treated as a complete failure of the callable unit in which the error was detected. An appropriate exception SHALL be thrown and control SHALL pass out of the unit forthwith.

    Applies To: Programmed device

    Test Reference: Part 3: 4.5.1 “Workmanship”

    6.4.1.8-G Do not disable error checks

    Error checks detailed in Requirement Part 1: 6.4.1.8-B and Requirement Part 1: 6.4.1.8-C SHALL remain active in production code.

    Applies To: Programmed device

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    These errors are incompatible with voting integrity, so masking them is unacceptable.

    Manufacturers should not implement error checks using the C/C++ assert() macro. It is often disabled, sometimes automatically, when software is compiled in production mode. Furthermore, it does not appropriately throw an exception, but instead aborts the program.

    "Inevitably, the programmed validity checks of the defensive programming approach will result in run-time overheads and, where performance demands are critical, many checks are often removed from the operational software; their use is restricted to the testing phase where they can identify the misuse of components by faulty designs. In the context of producing complex systems which can never be fully tested, this tendency to remove the protection afforded by programmed validity checks is most regrettable and is not recommended here." [Moulding89]

    6.4.1.8-H Roles authorized to respond to errors

    Exceptions resulting from failed error checks or CPU-level exceptions SHALL require intervention by an election official or administrator before voting can continue.

    Applies To: Programmed device

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    These errors are incompatible with voting integrity, so masking them is unacceptable.

    6.4.1.8-I Diagnostics

    Electronic devices SHALL include a means of identifying device failure and any corrective action needed.

    Applies To: Electronic device

    Test Reference: Part 3: 4.5.1 “Workmanship”

    Source: Generalized from [VSS2002] I.2.4.1.2.2.c and I.2.4.1.3.d

    6.4.1.8-J Equipment health monitoring

    Electronic devices SHOULD proactively detect equipment failures and alert an election official or administrator when they occur.

    Applies To: Electronic device

    Test Reference: Part 3: 4.5.1 “Workmanship”

    Source: Response to Issue #2147

    6.4.1.8-K Election integrity monitoring

    To the extent possible, electronic devices SHALL proactively detect or prevent basic violations of election integrity (e.g., stuffing of the ballot box or the accumulation of negative votes) and alert an election official or administrator if they occur.

    Applies To: Electronic device

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    Equipment can only verify those conditions that are within the scope of what the equipment does. However, insofar as the equipment can detect something that is blatantly wrong, it should do so and raise the alarm. This provides defense-in-depth to supplement procedural controls and auditing practices.

    Source: Response to Issue #2147

    6.4.1.9 Recovery

    For specific requirements regarding misfed paper ballots or hangs during the vote-casting function, see Requirement Part 1: 3.2.2.1-F and Requirement Part 1: 3.2.2.2-F, Requirement Part 1: 7.7.4-A and Requirement Part 1: 7.7.4-B.

    6.4.1.9-A System shall survive device failure

    All systems SHALL be capable of resuming normal operation following the correction of a failure in any device.

    Applies To: Voting system

    Test Reference: Part 3: 4.5.1 “Workmanship”

    Source: Extrapolated from [VSS2002] I.2.2.3

    6.4.1.9-B Failures shall not compromise voting or audit data

    Exceptions and system recovery SHALL be handled in a manner that protects the integrity of all recorded votes and audit log information.

    Test Reference: Part 3: 4.5.1 “Workmanship”

    Source: Extracted and generalized from [VSS2002] I.4.2.3.e

    6.4.1.9-C Device shall survive component failure

    All voting devices SHALL be capable of resuming normal operation following the correction of a failure in any component (e.g., memory, CPU, ballot reader, printer) provided that catastrophic electrical or mechanical damage has not occurred.

    Applies To: Voting device

    Test Reference: Part 3: 4.5.1 “Workmanship”

    Source: Reworded from [VSS2002] I.2.2.3.b and c

    6.4.1.9-D Controlled recovery

    Error conditions SHALL be corrected in a controlled fashion so that system status may be restored to the initial state existing before the error occurred.

    Applies To: Programmed device

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    "Initial state" refers to the state existing at the start of a logical transaction or operation. Transaction boundaries must be defined in a conscientious fashion to minimize the damage. Language changed to "may" because election officials responding to the error condition might want the opportunity to select a different state (e.g., controlled shutdown with memory dump for later analysis).

    Source: Generalization from [VSS2002] I.2.2.5.2.2.g.

    6.4.1.9-D.1 Nested error conditions

    Nested error conditions that are corrected without reset, restart, reboot, or shutdown of the voting device SHALL be corrected in a controlled sequence so that system status may be restored to the initial state existing before the first error occurred.

    Test Reference: Part 3: 4.5.1 “Workmanship”

    Source: Slight relaxation of [VSS2002] I.2.2.5.2.2.g

    6.4.1.9-D.2 Reset CPU error states

    CPU-level exceptions that are corrected without reset, restart, reboot, or shutdown of the voting device SHALL be handled in a manner that restores the CPU to a normal state and allows the system to log the event and recover as with a software-level exception.

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    System developers should test to see how CPU-level exceptions are handled and make any changes necessary to ensure robust recovery. Invocation of any other error routine while the CPU is in an exception handling state is to be avoided – software error handlers often do not operate as intended when the CPU is in an exception handling state.

    If the platform supports it, it is preferable to translate CPU-level exceptions into software-level exceptions so that all exceptions can be handled in a consistent fashion within the voting application; however, not all platforms support it.

    Source: Added precision

    6.4.1.9-E Coherent checkpoints

    When recovering from non-catastrophic failure of a device or from any error or malfunction that is within the operator's ability to correct, the system SHALL restore the device to the operating condition existing immediately prior to the error or failure, without loss or corruption of voting data previously stored in the device.

    Applies To: Programmed device

    Test Reference: Part 3: 4.5.1 “Workmanship”

    DISCUSSION

    If, as discussed in Requirement Part 1: 6.4.1.9-D, the system is left in something other than the last known good state for diagnostic reasons, this requirement clarifies that it must revert to the last known good state before being placed back into service.

    Source: [VSS2002] I.2.2.3.a

    6.4.2 Quality assurance and configuration management

    The quality assurance and configuration management requirements discussed in this section help assure that voting systems conform to the requirements of the VVSG. Quality Assurance is a manufacturer function with associated practices that is initiated prior to system development and continues throughout the maintenance life cycle of the voting system. Quality Assurance focuses on building quality into a system and reducing dependence on system tests at the end of the life cycle to detect deficiencies, thus helping ensure that the system:

    • Meets stated requirements and objectives;
    • Adheres to established standards and conventions;
    • Functions consistent with related components and meets dependencies for use within the jurisdiction; and
    • Reflects all changes approved during its initial development, internal testing, qualification, and, if applicable, additional certification processes.

    Configuration management is a set of activities and associated practices that ensures full knowledge and control of the components of a system, starting with its initial development, progressing through its ongoing maintenance and enhancement, and including its operational life cycle.

    6.4.2.1 Standards based framework for Quality Assurance and Configuration Management

    The requirement in this section establishes the quality assurance and configuration management standards to which voting system manufacturers must conform. The requirement to develop a Quality and Configuration Management manual, and the detailed requirements on that manual, are contained in Part 2, Chapter 2.

    6.4.2.1-A List of standards

    Voting system manufacturers SHALL implement a quality assurance and configuration management program that is conformant with the recognized ISO standards in these areas:

    1. ISO 9000:2005 [ISO05];
    2. ISO 9001:2000 [ISO00]; and
    3. ISO 10007:2003 [ISO03].

    6.4.2.2 Configuration Management requirements

    This section specifies the key configuration management requirements for voting system manufacturers. The requirements include those for equipment identification tags and configuration logs. Continuation of the program, in the form of usage logs, is the responsibility of State and local officials.

    6.4.2.2-A Identification of systems

    Each voting system SHALL have an identification tag that is attached to the main body.

    Applies To: Voting system

    Test Reference: Part 3: 3.1 “Inspection”, 4.4.2 “Examination of voting systems submitted for testing”

    Source: New requirement

    6.4.2.2-A.1 Secure tag

    The tag SHALL be tamper-resistant and difficult to remove.

    Applies To: Voting system

    Test Reference: Part 3: 3.1 “Inspection”, 4.4.2 “Examination of voting systems submitted for testing”

    Source: New requirement

    6.4.2.2-A.2 Tag contents

    The tag SHALL contain the following information:

    1. The voting system model identification in the form of a model number and possibly a model name. The model identification identifies the exact variant or version of the system;
    2. The serial number that uniquely identifies the system;
    3. Identification of the manufacturer, including address and contact information for technical service, and manufacturer certification information; and
    4. Date of manufacture of the voting system.

    Applies To: Voting system

    Test Reference:Part 3: 3.1 “Inspection”, 4.4.2 “Examination of voting systems submitted for testing”

    Source: New requirement

    6.4.2.2-B The Voting System Configuration Log

    For each voting system manufactured, a Voting System Configuration Log SHALL be established.

    Applies To: Voting system

    Test Reference:Part 3: 3.1 “Inspection”, 4.4.2 “Examination of voting systems submitted for testing”

    DISCUSSION

    The Log is initialized by the configuration data supplied by the manufacturer. From that point on, it functions like a diary of the system. Entries are made by election officials whenever any change occurs. Every exception, disruption, anomaly, and failure is recorded. Every time the cover is opened for inspection, or a repair or maintenance is performed, an entry details what was done, which component was removed and which component replaced it, as well as any diagnosis of failures or exceptions.

    Source: New requirement

    6.4.2.2-B.1 Contents

    The Log SHALL contain the following information:

    1. The information on the system tag described in Requirement 6.4.2.2-A.2;
    2. The identification of all critical parts, components, and assemblies of the system; and
    3. The complete historical record, as developed by the manufacturer per Requirement Part 2: 2.1-A.12, of all critical parts, components, and assemblies included in the voting system.

    Applies To: Voting system

    Test Reference: Part 3: 3.1 “Inspection”, 4.4.2 “Examination of voting systems submitted for testing”

    DISCUSSION

    The list of critical parts, components, and assemblies should be consistent with the rules for determining which of these entities is critical, as specified in the Quality and Configuration Manual. See Requirement Part 2: 2.1-A.6.

    Source: New requirement

    6.4.2.2-B.2 Log medium

    The Log SHALL be kept on a medium that allows the writing, but not the modification or deletion, of records.

    Applies To: Voting system

    Test Reference: Part 3: 3.1 “Inspection”, 4.4.2 “Examination of voting systems submitted for testing”

    Source: New requirement

    6.4.3 General build quality

    6.4.3-A General build quality

    All manufacturers of voting systems SHALL practice proper workmanship.

    Applies To: Voting system

    Test Reference: Part 3: 4.3 “Verification of Design Requirements”

    Source: New requirement

    6.4.3-A.1 High quality products

    All manufacturers SHALL adopt and adhere to practices and procedures to ensure that their products are free from damage or defect that could make them unsatisfactory for their intended purpose.

    Applies To: Voting system

    Test Reference: Part 3: 4.3 “Verification of Design Requirements”

    Source: [VSS2002] I.3.4.7.a / [VVSG2005] I.4.3.7.a

    6.4.3-A.2 High quality parts

    All manufacturers SHALL ensure that components provided by external suppliers are free from damage or defect that could make them unsatisfactory or hazardous when used for their intended purpose.

    Applies To: Voting system

    Test Reference: Part 3: 4.3 “Verification of Design Requirements”

    Source: [VSS2002] I.3.4.7.b / [VVSG2005] I.4.3.7.b

    6.4.3-B Suitability of COTS Components

    Manufacturers SHALL ensure that all COTS components included in their voting systems are designed to be suitable for their intended use under the requirements specified by these VVSG.

    Applies To: Voting system

    Test Reference: Requirement Part 3: 4.1-B

    DISCUSSION

    For example, if the operating and/or storage environmental conditions specified by the manufacturer of a printer do not meet or exceed the requirements of these VVSG, a system that includes that printer cannot be found conforming.

    Source: New requirement

    6.4.4 Durability

    6.4.4-A Durability

    Voting systems SHALL be designed to withstand normal use without deterioration for a period of ten years.

    Applies To: Voting system

    Test Reference: Part 3: 4.3 “Verification of Design Requirements”

    Source: [VSS2002] I.3.4.2 / [VVSG2005] I.4.3.2

    6.4.4-B Durability of paper

    Paper specified for use with the voting system SHALL conform to the applicable specifications contained within the Government Paper Specification Standards, February 1999 No. 11, or the government standards that have superseded them.

    Applies To: Voting system

    Test Reference: Part 3: 4.1 “Initial Review of Documentation”

    DISCUSSION

    This is to ensure that paper records will be of adequate quality to survive the handling necessary for recounts, audits, etc. without problematic degradation. The Government Paper Specification Standards include different specifications for different kinds of paper. As of 2007-04-05, the Government Paper Specification Standards, February 1999 No. 11, are available at http://www.gpo.gov/acquisition/paperspecs.htm [GPO99].

    Source: New requirement

    6.4.5 Maintainability

    Maintainability represents the ease with which maintenance actions can be performed based on the design characteristics of equipment and software and the processes the manufacturer and election officials have in place for preventing failures and for reacting to failures. Maintainability includes the ability of equipment and software to self-diagnose problems and to make non-technical election workers aware of a problem. Maintainability addresses all scheduled and unscheduled maintenance activities, which are performed to:

    • Determine the operational status of the system or a component;
    • Determine if there is a problem with the equipment and be able to take it off-line (out of service) while retaining all cast ballot data;
    • Adjust, align, tune, or service components;
    • Repair or replace a component having a specified operating life or replacement interval;
    • Repair or replace a component that exhibits an undesirable predetermined physical condition or performance degradation;
    • Repair or replace a component that has failed;
    • Ensure that, by following manufacturer protocols provided in the TDP, all repairs or replacements of devices or components during election use preserve all stored ballot data and/or election results, as appropriate; and
    • Verify the restoration of a component, or the system, to operational status.

    Maintainability is determined based on the presence of specific physical attributes that aid system maintenance activities, and the ease with which the testing laboratory can perform system maintenance tasks. Although a more quantitative basis for assessing maintainability, such as the mean time to repair the system, is desirable, laboratory testing of a system is conducted before it is approved for sale and thus before a broader base of maintenance experience can be obtained.

    6.4.5-A Electronic device maintainability

    Electronic devices SHALL exhibit the following physical attributes:

    1. Labels and the identification of test points;
    2. Built-in test and diagnostic circuitry or physical indicators of condition;
    3. Labels and alarms related to failures; and
    4. Features that allow non-technicians to perform routine maintenance tasks.

    Applies To: Electronic device

    Test Reference: Part 3: 4.3 “Verification of Design Requirements”

    Source: [VSS2002] I.3.4.4.1 / [VVSG2005] I.4.3.4.1

    6.4.5-B System maintainability

    Voting systems SHALL allow for:

    1. A non-technician to easily detect that the equipment has failed;
    2. A trained technician to easily diagnose problems;
    3. Easy access to components for replacement;
    4. Easy adjustment, alignment, and tuning of components; and
    5. Low false alarm rates (i.e., indications of problems that do not exist).

    Applies To: Voting system

    Test Reference: Part 3: 4.3 “Verification of Design Requirements”

    Source: [VSS2002] I.3.4.4.2 / [VVSG2005] I.4.3.4.2

    6.4.5-C Nameplate and labels

    All voting devices SHALL:

    1. Display a permanently affixed nameplate or label containing the name of the manufacturer, the name of the device, its part or model number, its revision identifier, its serial number, and if applicable, its power requirements;
    2. Display a separate data plate containing a schedule for and list of operations required to service or to perform preventive maintenance, or a reference to where this can be found in the Voting Equipment User Documentation; and
    3. Display advisory caution and warning instructions to ensure safe operation of the equipment and to avoid exposure to hazardous electrical voltages and moving parts at all locations where operation or exposure may occur.

    Applies To: Voting device

    Test Reference: Part 3: 4.3 “Verification of Design Requirements”

    Source: [VSS2002] I.3.4.6

    6.4.6 Temperature and humidity

    6.4.6-A Operating temperature and humidity

    Voting systems SHALL be capable of operation in temperatures ranging from 5 °C to 40 °C (41 °F to 104 °F) and relative humidity from 5 % to 85 %, non-condensing.[8]

    Applies To: Voting system

    Test Reference: Part 3: 5.1.5 “Operating environmental testing”

    Source: [P1583] 5.4.5[5]

    6.4.7 Equipment transportation and storage

    This section addresses items such as touchscreens going out of calibration, memory packs failing after delivery from the central office to the precinct, and high rates of system failure when equipment is taken out of storage.

    6.4.7-A Survive transportation

    Voting devices designated for storage between elections SHALL continue to meet all applicable requirements after transit to and from the place of use.

    Applies To: Voting device

    Test Reference: Part 3: 5.1 “Hardware”

    Source: [VSS2002] I.2.6.a / [VVSG2005] I.2.5.a, generalized

    6.4.7-B Survive storage

    Voting devices designated for storage between elections SHALL continue to meet all applicable requirements after storage between elections.

    Applies To: Voting device

    Test Reference: Part 3: 5.1 “Hardware”

    Source: [VSS2002] I.2.6.b / [VVSG2005] I.2.5.b, generalized

    6.4.7-C Precinct devices storage

    Precinct tabulators and vote-capture devices SHALL be designed for storage in any enclosed facility ordinarily used as a warehouse, with prominent instructions as to any special storage requirements.

    Applies To: Precinct tabulator, Vote-capture device

    Test Reference: Part 3: 4.3 “Verification of Design Requirements”

    Source: [VSS2002] I.3.2.2.1 / [VVSG2005] I.4.1.2.1

    6.4.7-C.1 Design for storage and transportation

    Precinct tabulators and vote-capture devices SHALL:

    1. Provide a means to safely and easily handle, transport, and install polling place equipment, such as wheels or a handle or handles; and
    2. Be capable of using, or be provided with, a protective enclosure rendering the equipment capable of withstanding (1) impact, shock and vibration loads accompanying surface and air transportation, and (2) stacking loads accompanying storage.

    Test Reference: Part 3: 4.3 “Verification of Design Requirements”

    Source: [VSS2002] I.3.3.3 / [VVSG2005] I.4.2.3

    6.4.7-D Transportation and storage conditions benchmarks

    Voting devices SHALL meet specific minimum performance requirements for transportation and storage.

    Applies To: Voting device

    Test Reference: Part 3: 5.1 “Hardware”

    DISCUSSION

    The associated tests simulate exposure to the physical shock and vibration associated with handling and transportation by surface and air common carriers, and to temperature conditions associated with delivery and storage in an uncontrolled warehouse environment.

    Source: [VSS2002] I.3.2.2.14, modified by [P1583] 5.4.6[5]

    6.4.7-D.1 Storage temperature

    Voting devices SHALL withstand high and low storage temperatures ranging from –20 °C to 60 °C (–4 °F to 140 °F).

    Applies To: Voting device

    Test Reference: Part 3: 5.1 “Hardware”

    Source: [VSS2002] I.3.2.2.14.a, modified by [P1583] 5.4.6.a[5]

    6.4.7-D.2 Bench handling

    Voting devices SHALL withstand bench handling equivalent to the procedure of MIL-STD-810D, Method 516.3, Procedure VI [MIL83].

    Applies To: Voting device

    Test Reference: Part 3: 5.1 “Hardware”

    Source: [VSS2002] I.3.2.2.14.b

    6.4.7-D.3 Vibration

    Voting devices SHALL withstand vibration equivalent to the procedure of MIL-STD-810D, Method 514.3, Category 1—Basic Transportation, Common Carrier [MIL83].

    Applies To: Voting device

    Test Reference: Part 3: 5.1 “Hardware”

    Source: [VSS2002] I.3.2.2.14.c

    6.4.7-D.4 Storage humidity

    Voting devices SHALL withstand uncontrolled humidity equivalent to the procedure of MIL-STD-810D, Method 507.2, Procedure I-Natural Hot-Humid [MIL83].

    Applies To: Voting device

    Test Reference: Part 3: 5.1 “Hardware”

    Source: [VSS2002] I.3.2.2.14.d

    6.5 Archival Requirements

    6.5.1 Archivalness of media

    See Appendix A for the definition of archivalness.

    6.5.1-A Records last at least 22 months

    All systems SHALL maintain the integrity of election management, voting and audit data, including CVRs, during an election and for a period of at least 22 months afterward, in temperatures ranging from 5 °C to 40 °C (41 °F to 104 °F) and relative humidity from 5 % to 85 %, non-condensing.

    Applies To: Voting system

    Test Reference: Part 3: 4.3 “Verification of Design Requirements”

    DISCUSSION

    See also Requirement Part 1: 6.5.2, Part 1: 6.5.3 and Requirement Part 2: 4.4.8-C.

    Source: Merged from [VSS2002] I.2.2.11 and I.3.2.3.2; temperature and humidity harmonized with Requirement Part 1: 6.4.6-A

    6.5.2 Procedures required for correct system functioning

    The requirements for voting systems are written assuming that these procedures will be followed.

    Statutory period of retention: All printed copy records produced by the election database and ballot processing systems must be labeled and archived for a period of at least 22 months after the election. ([VSS2002] I.2.2.11) See also Requirement Part 1: 6.5.1-A and Part 1: 6.5.3.

    6.5.3 Period of retention (informative)

    This informative section provides extended discussion for Requirement Part 1: 6.5.1-A and Part 1: 6.5.2.

    United States Code Title 42, Sections 1974 through 1974e, states that election administrators must preserve for 22 months "all records and paper that came into (their) possession relating to an application, registration, payment of poll tax, or other act requisite to voting." This retention requirement applies to systems that will be used at any time for voting of candidates for federal offices (e.g., Member of Congress, United States Senator, and/or Presidential Elector). Therefore, all systems must provide for maintaining the integrity of voting and audit data during an election and for a period of at least 22 months thereafter.

    Because the purpose of this law is to assist the federal government in discharging its law enforcement responsibilities in connection with civil rights and elections crimes, its scope must be interpreted in keeping with that objective. The appropriate state or local authority must preserve all records that may be relevant to the detection and prosecution of federal civil rights or election crimes for the 22-month federal retention period, if the records were generated in connection with an election that was held in whole or in part to select federal candidates. It is important to note that Section 1974 does not require that election officials generate any specific type or classification of election record. However, if a record is generated, Section 1974 comes into force and the appropriate authority must retain the records for 22 months.

    For 22-month document retention, the general rule is that all printed copy records produced by the election database and ballot processing systems must be so labeled and archived. Regardless of system type, all audit trail information spelled out in Part 1: 5.7 must be retained in its original format, whether that be real-time logs generated by the system, or manual logs maintained by election personnel. The election audit trail includes not only in-process logs of election night (and subsequent processing of absentee or provisional ballots), but also time logs of baseline ballot definition formats, and system readiness and testing results.

    In many voting systems, the source of election-specific data (and ballot styles) is a database or file. In precinct count systems, this data is used to program each machine, establish ballot layout, and generate tallying files. It is not necessary to retain this information on electronic media if there is an official, authenticatable printed copy of all final database information. However, it is recommended that the state or local jurisdiction also retain electronic records of the aggregate data for each device so that reconstruction of an election is possible without data re-entry. The same requirement and recommendation apply to vote results generated by each precinct device or system.

    6.6 Integratability and Data Export/Interchange

    The requirements in this section deal with making voting device interfaces and data formats transparent and interchangeable. The advantages of transparency and interchangeability include that systems and devices may work across different manufacturers and that data can be conveniently aggregated and analyzed across different platforms. The requirements address (a) integratability of hardware and (b) common public formats for data. The requirements in this section do not address or mandate true interoperability of interfaces and data; however, they reduce the barriers to interoperability.

    Integratability deals with the physical and technical aspects of connections between systems and devices, including hardware, firmware, and protocols. Basic integratability of devices is achieved through the use of common, standard hardware interfaces and interface protocols such as USB. Thus, a printer port must not be proprietary; it must use a common hardware interface and interface protocol, with the goal that printers of a similar type are interchangeable.

    Systems and devices that are integratable are designed such that components of systems may be compatible or can be made compatible with each other through some moderate amount of effort, for example, by writing "glue code." For example, an audit device may be designed to work with a DRE, but it may require adaptations to protocols for signaling or data exchange. Adapting the audit interface to the DRE may require some amount of software modification but should still be within reasonable bounds.
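
    The following Python fragment is a purely illustrative sketch of such "glue code"; it is not part of any requirement, and every class and method name in it is invented for this example. It shows one plausible shape for an adapter that translates a hypothetical audit device's status protocol into the ready/not-ready signal a DRE might poll for.

        # Hypothetical illustration only: adapting an audit device's protocol to a
        # DRE's expected signaling interface ("glue code"). All names are invented.

        class AuditDevice:
            """Stand-in for a third-party audit device driver."""
            def fetch_status(self) -> dict:
                # A real driver would query the hardware here.
                return {"code": 0, "detail": "ready"}

        class DREAuditPort:
            """Interface the DRE side expects an attached audit device to implement."""
            def poll_ready(self) -> bool:
                raise NotImplementedError

        class AuditDeviceAdapter(DREAuditPort):
            """Glue code: map the audit device's status codes onto the boolean
            ready signal the DRE polls for."""
            def __init__(self, device: AuditDevice) -> None:
                self.device = device

            def poll_ready(self) -> bool:
                return self.device.fetch_status().get("code") == 0

        # Usage: the DRE side sees only the DREAuditPort interface.
        adapter = AuditDeviceAdapter(AuditDevice())
        assert adapter.poll_ready()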

    The barriers to interoperability are further reduced if all systems support the same commonly agreed upon, publicly-available data format for ballot definition, records and reports. The advantages of using common data formats include:

    • Common formats for specifying election programming data, such as ballot definition files, promote greater accuracy and reduce duplication;
    • Common exported data formats can assist in aggregating results and conducting analyses and audits across manufacturers’ systems; and
    • Common formats for data reports can be mapped as necessary to locality-specific reports, rather than requiring each device to export reports in the locality-specific format.

    Although these requirements do not mandate a specific standard data format, manufacturers are encouraged to use consensus-based, publicly available formats such as the OASIS Election Markup Language (EML) standard [OASIS07] or those emanating from the IEEE Voting System Electronic Data Interchange Project 1622 [P1622].
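
    To make the idea of a documented, non-restrictive export format more concrete, the short Python sketch below serializes a single cast vote record as XML using only the standard library. The element names (CastVoteRecord, Contest, Selection) are invented for illustration and are not taken from EML, the P1622 work, or any manufacturer's schema; a manufacturer would instead follow whichever published format it adopts and document its usage as required below.

        # Illustrative only: a toy CVR export written as XML. The element names
        # are invented and do not represent EML, P1622, or any actual schema.
        import xml.etree.ElementTree as ET

        def export_cvr(device_id: str, contests: dict) -> str:
            """Serialize one cast vote record to an XML string."""
            root = ET.Element("CastVoteRecord", attrib={"deviceId": device_id})
            for contest, selection in contests.items():
                c = ET.SubElement(root, "Contest", attrib={"name": contest})
                ET.SubElement(c, "Selection").text = selection
            return ET.tostring(root, encoding="unicode")

        print(export_cvr("scanner-07", {"Governor": "Candidate A"}))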

    The requirements in this section mandate the following:

    • Common hardware interfaces;
    • Non-restrictive, publicly available formats for data export and interchange; and
    • Documentation for the format and for how the manufacturer has implemented it, including sample source code for reading the format.

    The requirements promote, but do not mandate the following:

    • Integration of voting devices from different manufacturers;
    • Non-restrictive, publicly available formats for data export, interchange, and reports among each manufacturer’s products; and
    • Non-restrictive, publicly available formats for data export, interchange, and reports across all manufacturers’ products.

    6.6-A Integratability of systems and devices

    Systems SHALL maximize integratability with other systems and/or devices of other systems.

    Applies To: Voting system

    Test Reference: Part 3: 3.5 “Interoperability Testing”, 4.3 “Verification of Design Requirements”

    DISCUSSION

    This is a goal-oriented requirement to promote interoperability of voting system devices among and across manufacturers.

    Source: Generalized from database design requirements in [VSS2002] I.2.2.6 and some state RFP(s)

    6.6-A.1 Standard device interfaces

    Standard, common hardware interfaces and protocols SHALL be used to connect devices.

    Applies To: Voting system

    Test Reference: Part 3: 3.5 “Interoperability Testing”, 4.3 “Verification of Design Requirements”

    DISCUSSION

    Standard hardware interfaces must be used to connect devices.

    Source: VVSG 2005 Section 7.9.4

    6.6-B Data export and exchange format

    Data that is exported and exchanged between systems and devices SHALL use a non-restrictive, publicly-available format.

    Applies To: Voting system

    Test Reference: Part 3: 3.5 “Interoperability Testing”, 4.3 “Verification of Design Requirements”

    DISCUSSION

    This is a goal-oriented requirement to promote interoperability of exported data and data exchanged between devices. For example, CVRs exported from different devices should use the same common format so that they can be easily aggregated for use in random audits. Reports from ballot activation devices or other devices that produce reports should use common formats that, if necessary, can be mapped to locality-specific formats.

    Source: VVSG 2005 Section 7.9.3
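
    Building on the discussion above, the following hedged sketch illustrates why a shared export format helps with audits: if every device writes CVRs with the same structure (here the invented XML structure from the earlier sketch), a jurisdiction can aggregate them with one small script regardless of which manufacturer produced each file.

        # Illustrative only: tally CVR files that share a common (invented) XML
        # structure, whatever device or manufacturer produced them.
        import xml.etree.ElementTree as ET
        from collections import Counter
        from pathlib import Path

        def tally_cvrs(cvr_dir: str) -> Counter:
            """Count selections per contest across all CVR files in a directory."""
            totals: Counter = Counter()
            for path in Path(cvr_dir).glob("*.xml"):
                root = ET.parse(path).getroot()
                for contest in root.findall("Contest"):
                    totals[(contest.get("name"), contest.findtext("Selection"))] += 1
            return totals

        # Usage, assuming a directory of exports shaped like the earlier sketch:
        # print(tally_cvrs("exported_cvrs"))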

    6.6-B.1 Exchange of election programming data and report data

    EMSs SHALL use a non-restrictive, publicly-available format with respect to election programming data and report data (the content of vote data reports, audit reports, etc.).

    Applies To: EMS

    Test Reference: Part 3: 3.5 “Interoperability Testing”, 4.3 “Verification of Design Requirements”

    DISCUSSION

    The purpose of this requirement is to further the use of common formats for (a) the specification of election definition files and other election programming, and (b) the report data produced by the EMS, such as status and audit-related reports.

    Source: Generalized from database design requirements in [VSS2002] I.2.2.6 and some state RFP(s)

    6.6-B.2 Exchange of CVRs

    DREs and optical scanners SHALL use a non-restrictive, publicly-available format with respect to export of CVRs.

    Applies To: DRE, Optical Scanner

    Test Reference: Part 3: 3.5 “Interoperability Testing”, 4.3 “Verification of Design Requirements”

    DISCUSSION

    The purpose of this requirement is to further the use of common formats for exported CVRs produced by vote-capture devices.

    Source: Generalized from database design requirements in [VSS2002] I.2.2.6, VVSG 2005 Section 7.9.3, and some state RFP(s)

    6.6-B.3 Exchange of report data

    The voting system SHALL use a non-restrictive, publicly-available format with respect to export of report data.

    Applies To: Voting system

    Test Reference: Part 3: 3.5 “Interoperability Testing”, 4.3 “Verification of Design Requirements”

    DISCUSSION

    The purpose of this requirement is to further the use of common formats for reports produced by voting devices.

    Source: New requirement

    6.6-B.4 Specification of common format usage

    The voting system manufacturer SHALL provide a specification describing how the manufacturer has implemented the format with respect to the manufacturer’s specific voting devices and data, including such items as descriptions of elements, attributes, constraints, extensions, syntax and semantics of the format, and definitions for data fields and schemas.

    Applies To: Voting system

    Test Reference: Part 3: 4.1 “Initial Review of Documentation”

    DISCUSSION

    Conformance to a common format does not guarantee interoperability. The manufacturer must document fully how it has interpreted and implemented the common format for its voting devices and the types of data exchanged/exported.

    Source: VVSG 2005 Section 7.9.3

    6.6-B.5 Source code specification of common format

    The voting system manufacturer SHALL provide a software program with source code to show how the manufacturer has programmatically implemented the format.

    Applies To: Voting system

    Test Reference: Part 3: 4.1 “Initial Review of Documentation”

    Source: VVSG 2005 Section 7.9.3
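
    As a rough indication of what the required sample program might look like, the sketch below reads and validates the invented CVR structure used in the earlier examples. It is an assumption-laden stand-in, not any manufacturer's actual implementation; a real submission would target the manufacturer's documented schema and its constraints.

        # Illustrative only: a minimal reader for the invented CVR structure used
        # in the earlier sketches, checking the fields its documentation would name.
        import sys
        import xml.etree.ElementTree as ET

        def read_cvr(path: str) -> dict:
            """Parse one CVR file into {contest: selection}, validating required fields."""
            root = ET.parse(path).getroot()
            if root.tag != "CastVoteRecord" or "deviceId" not in root.attrib:
                raise ValueError(f"{path}: not a recognized CastVoteRecord")
            record = {}
            for contest in root.findall("Contest"):
                name = contest.get("name")
                selection = contest.findtext("Selection")
                if not name or selection is None:
                    raise ValueError(f"{path}: contest missing name or selection")
                record[name] = selection
            return record

        if __name__ == "__main__":
            for filename in sys.argv[1:]:
                print(filename, read_cvr(filename))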

    6.6-B.6 Common format across manufacturer

    The voting system manufacturer SHOULD use a common format for export and interchange of data and reports across its major device categories.

    Applies To: Voting system

    Test Reference: Part 3: 3.5 “Interoperability Testing”, 4.3 “Verification of Design Requirements”

    DISCUSSION

    Different equipment from the same manufacturer should be interoperable with respect to data format. For example, a common ballot definition should apply to all of the manufacturer’s vote-capture devices and should not be specific to each device. Export of data (e.g., reports and CVRs) should use a common format across all devices.

    Source: New requirement

    6.6-B.7 Consensus-based format

    Voting systems SHOULD use a common, consensus-based format for export and interchange of data and reports.

    Applies To: Voting system

    Test Reference: Part 3: 3.5 “Interoperability Testing”, 4.3 “Verification of Design Requirements”

    DISCUSSION

    Manufacturers should use a consensus-based format that is common to all manufacturers. The OASIS Election Markup Language (EML) standard [OASIS07] is currently being considered as one possible common format. The IEEE P1622 working group [P1622] is studying several formats for eventual standardization.

    Source: VVSG 2005 Section 7.9.3

    6.7 Procedures required for correct system functioning

    The requirements for voting systems are written assuming that these procedures will be followed.

    Follow instructions: The voting system must be deployed, calibrated, and tested in accordance with the voting equipment user documentation provided by the manufacturer.