Scanning technology is used to convert paper or other documents into digital documents, helping to reduce the amount of paper created in a typical business environment and to increase the speed at which business can be conducted. A scanner comprises a plurality of photosensors, typically in the form of a linear array, that move relative to an image on a sheet of paper. As the linear array of photosensors moves across the image, each photosensor outputs a series of signals related to the intensity of reflected light from the small area of the image focused on by a given photosensor at a given moment. These output signals are then sampled and collected, in a manner known in the art, and used to generate digital image data.
The responsivity of a particular photosensor, and, by extension, an entire scanner, is defined by the relationship between the intensity of light impinging on the photosensor and the resulting value of the output signal, typically a voltage, or a digital “gray-scale” value derived from the voltage. If the responsivity is considered in the form of a graph in which increasing light intensity forms the x-axis and the output voltage forms the y-axis, the gain of the photosensor is the slope of the linear relationship, while the offset is indicated by the y-axis intercept, i.e., the voltage output of the photosensor at zero light intensity.
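The linear responsivity model described above can be sketched as follows. This is a minimal illustration only; the function name and the particular gain and offset values are assumptions for the example, not values from the text.

```python
# Linear responsivity model: V = gain * I + offset,
# where the offset is the output voltage at zero light intensity
# (the y-axis intercept) and the gain is the slope of the line.

def sensor_output(intensity, gain, offset):
    """Output voltage of a photosensor for a given light intensity
    under the linear responsivity model."""
    return gain * intensity + offset

# Hypothetical photosensor: gain of 0.8 V per unit intensity,
# dark offset of 0.1 V.
v_dark = sensor_output(0.0, gain=0.8, offset=0.1)    # y-intercept, 0.1 V
v_bright = sensor_output(1.0, gain=0.8, offset=0.1)  # approximately 0.9 V
```

In this picture, calibration amounts to re-estimating `gain` and `offset` for each photosensor so that known black and white references map to consistent output values.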
In the practical, day-to-day use of a scanner, the responsivity of each of the photosensors changes over time, and these changes must be compensated for periodically in order to ensure consistent output from the scanner. Common sources of long-term performance variation in a scanner include the declining intensity of the internal light source and the gradual fading of test surfaces from their original preset reflectivities.
The periodic compensation of the system for changes in responsivity is known as “calibration” of the scanner. Calibration is typically carried out at power-up of the scanner and generally comprises exposing the photosensors in the array to two test strips built into the structure of the scanner: a black test strip for setting the offset, and a white test strip of predetermined reflectivity for setting the gain. Alternatively, a single white strip may be used for white calibration, with the illumination turned off for black calibration. The test strips are of sufficient width that each photosensor moving across them can measure the reflectivity of a plurality of small, pixel-size regions in each strip. For both the black measurement (whether from a black strip or with the illumination off) and the white measurement, the calibration system measures reflectivity values for a number of pixel-size regions and averages the readings in order to “smooth out” any small variations in the reflectivity (either black or white) of a given test strip. In practical use, however, calibration methods that merely compute an average of a set of pixel-size regions on a test strip fail to take into account the possible presence of large anomalies in signals from the test strips, which may occur due to accumulation of specks of dust, dirt, or hair on the strips.
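The averaging calibration step described above can be sketched as follows. The function and variable names, the correction formula, and the target value are illustrative assumptions; the text itself does not specify a particular formulation.

```python
# Sketch of per-photosensor calibration from averaged test-strip readings:
# the averaged black reading sets the offset, and the averaged white
# reading of known reflectivity sets the gain.

def calibrate_sensor(black_readings, white_readings, white_target=255.0):
    """Return (offset, gain) such that corrected = (raw - offset) * gain
    maps the averaged black reading to 0 and the averaged white reading
    to white_target (e.g., full-scale gray value)."""
    offset = sum(black_readings) / len(black_readings)
    white_avg = sum(white_readings) / len(white_readings)
    gain = white_target / (white_avg - offset)
    return offset, gain

# Example: readings from several pixel-size regions of each strip
# for one photosensor (values are hypothetical).
offset, gain = calibrate_sensor([4, 5, 6, 5], [200, 202, 198, 200])
corrected_white = (200 - offset) * gain  # lands near white_target
```

Note that this plain averaging is exactly the approach the text criticizes: a single anomalous reading caused by a speck of dust would shift `offset` or `gain` for that photosensor.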
Besides the need for calibration, the various sensors in a scanner, including those within sensor arrays, may respond to the power supply voltage differently, and sensor output voltage ranges may vary from sensor to sensor. Variations among sensors also generally exist due to process variations during manufacturing. The light sources employed in scanners, such as light emitting diodes (LEDs), likewise often vary in their performance due to process variations.
There is, therefore, a need for methods and systems that can optimally control the light output of a light emitting diode (LED) in a scanner based upon sensor sensitivity and LED illumination, thereby correcting for process variations and other factors that may adversely affect exposure of the image being scanned. Such control allows selection of the optimum light output for a particular sensor response, giving the best range and signal-to-noise characteristics. There is a further need for a calibration method for setting the gain and offset of the photosensors in a scanner that takes into account, and discounts, anomalies in the signals from the test strips.
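One common way to discount anomalous readings of the kind described, sketched here purely as an illustration and not as the method the text proposes, is a trimmed mean: the most extreme readings are discarded before averaging, so a speck of dust or hair on the strip has little effect.

```python
# Robust alternative to a plain average of pixel-size strip readings:
# drop the lowest and highest fraction of values (where anomalous
# readings from dust, dirt, or hair tend to land) before averaging.

def trimmed_mean(readings, trim_fraction=0.1):
    """Average the readings after discarding the lowest and highest
    trim_fraction of values."""
    ordered = sorted(readings)
    k = int(len(ordered) * trim_fraction)
    kept = ordered[k:len(ordered) - k] if k > 0 else ordered
    return sum(kept) / len(kept)

# A speck of dust darkens one white-strip reading (120 among ~200s):
readings = [200, 201, 199, 200, 120, 200, 201, 199, 200, 200]
plain = sum(readings) / len(readings)  # pulled down toward 120
robust = trimmed_mean(readings, 0.1)   # stays near the true reflectivity
```

A median of the readings would serve a similar purpose; either choice discounts isolated anomalies while preserving the smoothing effect of averaging over many pixel-size regions.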