QUESTIONS FOR VENDORS

 

New York City Council, Hearing by the Government Operations Committee

November 21, 2005

Submitted by Teresa Hommel

 

 

A. WHAT DID THEY BRING

 

1.a. ES&S -- Why didn't you bring the Automark, which is federally certified with your optical scanner? (Several states recently switched to your optical scan systems, including Arizona, Minnesota, and Nebraska -- one page per state is attached.)

 

1.b. ES&S -- Why didn't you bring the privacy sleeve to demonstrate with the Automark and your optical scanner?

 

2. What is the purchase price of the system you are showing us?

 

3. Is this system federally certified? Is it certified to the 2002 federal standards? If so, what are the NASED certification numbers?

 

4. ES&S -- Does this equipment use the same software as the iVotronic?

Have you fixed the software errors that caused problems with that system in other jurisdictions?

 

5. Do you supply the ballot-programming software with this system, or is there an extra cost to acquire it? If that software is priced separately, how much is it?

 

6. How much do you charge for technical support? If the cost varies, what are the lower and upper amounts?

 

 

B. ACCESSIBILITY

(Principle: If these machines are supposed to be accessible, vendors should have designed and tested them extensively in conjunction with groups of voters with disabilities.)

 

7. Does this system have to be rebooted when a voter wishes to use the audio mode? If so, how long does that take? Does it have to be rebooted to turn off the audio mode and how long does that take?

 

8. When voters use the audio, does the screen always shut off so that voters who want to both see and hear at the same time cannot do so?

 

9. When voters use the hand-held device, does the screen always shut off, so that voters who use it are forced to rely on the audio and cannot see the screen at the same time?

 

10. How would an Iraqi veteran who has lost both arms and all vision use this system?

 

11. Have you met with focus groups to test this machine with voters who are

      elderly

      limited in manual reach, strength, and dexterity

      blind

 

      If so, which groups? Assuming a ballot with 8 or 9 races and an average of 5 candidates per race, how long would it take a voter with limited manual reach, strength, and dexterity to vote? How long would it take a blind voter? What suggestions for improvement did these different testers with disabilities make?

 

      If not, why not, since important provisions of HAVA are for voters with disabilities and that is a prime reason for changing our voting technology?

 

12. Did you involve groups of blind voters in designing the wording of your audio text? If so, which groups?

 

 

C. PAST PERFORMANCE

 

13. Can a voter select a straight-party vote? If so, after the voter selects a straight-party vote and then changes the selection for one race, are all the straight-party votes cancelled?

 

14. Have you ever been sued for errors in ballot programming that your technicians did for a jurisdiction?

 

15. In how many elections have you had to replace ballot programming provided by your technicians because of flaws in it?

 

 

D. TIMING

 

16. Given a ballot with 8 or 9 races, and an average of 5 candidates per race, approximately how long would it take to vote on this system?

How many voters would be able to use this system in an hour? In a 9-hour election day? (Two sample ballots are attached: November 2004 and November 2005.)

 

 

E. AUDITS

 

17. Would you allow your equipment to be demonstrated in a full public test involving

 

(a)  A mock election with at least 10 DREs and their central tabulator, if you provide one, using a real ballot such as the one from the November 2004 election; a "stress test" with the maximum number of voters the system will ever have to handle on one election day; entry of all possible vote combinations; use of all devices, including the touch screen or pushbuttons, all accessible devices, minority-language interfaces, and the printer, showing the handling of overvotes, undervotes, straight-party voting, modification of choices by the voter, second-chance voting, and all other capabilities; extraction of the end-of-day information from the system; and a complete audit of the results and logs created by the DREs.

 

(b)  Examination by respected computer scientists such as Dr. Avi Rubin, Dr. David Dill, or Dr. Rebecca Mercuri, of all internal and external memory of all kinds including all files, programming, operating system code, and any other memory contents.

 

(c)  A "red test" in which skilled and knowledgeable professionals and activists attempt to subvert the system, posing as insiders as well as outside hackers.