To test scan quality we used a printed copy of Henry David Thoreau's "On the Duty of Civil Disobedience," multiple handwritten, filled-out forms such as W-2s and 1040s, multiple receipts obtained from different businesses (and thus printed on different receipt printers), driver's licenses, and, for good measure, a full-color photo. We then scanned all of these documents on each scanner at its standard settings and compared the quality of the resulting PDFs side by side.
To test speed we used a 10-page, double-sided test document. We ran it through each scanner three times, starting a stopwatch the second scanning began and stopping it the second a PDF appeared on our hard drive. We then averaged those three times and from that calculated a pages-per-minute figure.
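The arithmetic behind that figure can be sketched as follows. The stopwatch readings here are hypothetical, not our actual results, and a double-sided pass of the 10-page document counts as 20 scanned pages:

```python
def average_scan_time(times_s):
    """Average of repeated stopwatch runs, in seconds."""
    return sum(times_s) / len(times_s)

def pages_per_minute(pages, avg_time_s):
    """Convert an average scan time into a pages-per-minute figure."""
    return pages * 60 / avg_time_s

# Hypothetical stopwatch readings (seconds) for three runs of the
# 10-page, double-sided (20 scanned pages) test document.
runs = [62.1, 60.8, 63.4]
avg = average_scan_time(runs)       # roughly 62.1 seconds
ppm = pages_per_minute(20, avg)     # roughly 19.3 pages per minute
```

Averaging three runs before converting to pages per minute smooths out one-off hiccups like a slow first feed.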
Our software testing was split into two categories: general interface and optical character recognition (OCR). To test the intuitiveness of each interface we set a series of tasks (create a scan, move the resulting PDF, adjust scan settings, etc.), had everyone in the office complete those tasks, and then had them grade the intuitiveness of each software package. To test OCR we scanned the same test document on each scanner, then searched each resulting PDF for the same set of keywords, noting when words were missed.
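The keyword check above amounts to a case-insensitive substring search over each PDF's extracted text. A minimal sketch of that check, using made-up OCR output in which one word has been misrecognized (the text and keywords are illustrative, not our actual test set):

```python
def missed_keywords(ocr_text, keywords):
    """Return the keywords a search would fail to find in the OCR text."""
    haystack = ocr_text.lower()
    return [kw for kw in keywords if kw.lower() not in haystack]

# Hypothetical OCR output where "Massachusetts" came out garbled.
text = "Henry David Thoreau of Concord, Massachusets, on civil disobedience"
missed = missed_keywords(text, ["Thoreau", "Concord", "Massachusetts"])
# "Massachusetts" is reported as missed; the other two are found.
```

A scanner whose OCR drops or mangles more of the keyword list scores worse on searchability.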
Our user-friendliness testing largely focused on the initial setup of each device, as this is where most annoyances tend to arise. We timed how long it took to get each model set up and talking to both a Mac and a PC. We also noted any annoyances that arose while using the scanners, namely paper jams or the lack of an output tray, which sent spent pages flying every which way.