In various applications, imaging devices are arranged to generate images of markings (letters, symbols, graphics, photographs, and so on) that they detect on a substrate while relative motion occurs between the substrate and a sensing unit in the imaging device. For instance, some printing devices include an optical scanner to scan the images that have been printed and this scanning is performed, for example, for quality assurance purposes and/or for the purpose of diagnosing defects or malfunctions affecting components of the printing device. In some cases the substrate is transported past a stationary sensing unit of the imaging device so that an image can be generated of the markings on the whole of the substrate (or on a selected portion of the substrate), and in some other cases the substrate is stationary and the sensing unit of the imaging device is transported relative to the substrate. The sensing unit may take any convenient form, for example it may employ TDI (time delay integration) devices, charge-coupled devices, contact image sensors, cameras, and so on.
In some applications a digital representation of a target image is supplied to a printing device, the printing device prints the target image on a substrate and then the target image on the substrate is scanned by an imaging device included in or associated with the printing device. The scan image generated by the imaging device may then be compared with the original digital representation for various purposes, for example: to detect defects in the operation of the printer, for calibration purposes, and so on.
In some cases the imaging device has a sensing unit that senses markings along an entire strip or line spanning the whole width of the substrate at the same time, and generates a line image representing those markings, then senses markings on successive lines across the substrate in successive time periods: here such a sensing unit shall be referred to as an in-line sensing unit. For example, an in-line sensing unit may include an array of contiguous sensing elements that, in combination, span the whole width of the substrate. A simple form of in-line sensing device includes a one-dimensional array of sensing elements. However, in certain technologies—for example TDI—plural rows of sensors may be provided and the line image may then be produced by averaging (to reduce noise). The number of sensing elements in the array, and the exposure time over which each sensing element/array integrates its input to produce its output, may be varied depending on the requirements of the application. A clock pulse generator may be used to synchronize the measurement timing of the in-line sensing unit so that in each of a series of successive periods (called either “detection periods” or “scan periods” below) the sensing unit generates an image of a respective line across the substrate.
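The noise-reducing effect of averaging plural sensor rows, as in TDI-style sensing, can be sketched as follows. This is an illustrative simulation only: the row count, line width, and noise level are arbitrary assumptions, not parameters of any particular device.

```python
import numpy as np

# Illustrative sketch (assumed parameters): a line image produced by
# averaging plural noisy sensor rows, as in TDI-style sensing.
rng = np.random.default_rng(0)

true_line = np.linspace(0.0, 1.0, 8)                # intensities along one line
rows = true_line + rng.normal(0.0, 0.05, (16, 8))   # 16 noisy readings of the line

line_image = rows.mean(axis=0)                      # averaged line image

# Averaging N rows reduces the noise standard deviation by roughly sqrt(N).
noise_single = np.abs(rows[0] - true_line).mean()
noise_avg = np.abs(line_image - true_line).mean()
```

With 16 rows averaged, the residual noise in `line_image` is roughly a quarter of that in any single row.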
Such an imaging device may include a processor that is arranged to process the signals output by the in-line sensing unit to create a two-dimensional scan image of the markings on the substrate by positioning the sensing-unit output measured at each detection time along a line at a spatial location in the scan image that corresponds to the detection time (taking into account the speed and direction of the relative displacement between the substrate and the in-line sensing unit). The duration of each detection period may be very short, and the interval between successive detection periods may also be very short, so that in a brief period of time the imaging device can construct a scan image that appears to the naked eye to be continuous in space (i.e. a viewer of the scan image cannot see the constituent lines).
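The assembly of line measurements into a two-dimensional scan image can be sketched as below. This is a minimal illustration assuming a constant nominal velocity such that each detection period advances the substrate by exactly one row of the scan image; the `capture_line` function is a hypothetical stand-in for the in-line sensing unit.

```python
import numpy as np

# Minimal sketch (assumed constant nominal velocity of one scan-image
# row per detection period). capture_line is a hypothetical stand-in
# for the in-line sensing unit's output at detection time t.
width, num_periods = 6, 4

def capture_line(t):
    # Returns the line of intensities sensed at detection time t.
    return np.full(width, float(t))

scan_image = np.zeros((num_periods, width))
for t in range(num_periods):
    # The row index corresponds to the detection time, scaled by the
    # nominal speed of the relative displacement.
    scan_image[t, :] = capture_line(t)
```

In a real device the row offset would be scaled by the measured or nominal page velocity rather than fixed at one row per period.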
If the relative motion between the substrate and the in-line sensing unit occurs at a constant linear velocity in the lengthwise direction of the substrate, then the positions on the substrate that are imaged by the in-line sensing unit at successive detection times are disposed along parallel lines that are spaced apart by equal distances in the lengthwise direction of the substrate. The processor then generates a scan image in which the sets of points imaged in the successive detection periods are likewise disposed along lines that are parallel to each other and are spaced apart by equal distances in the lengthwise direction of the scan image.
However, in practice, even in devices that are designed to employ constant-velocity linear relative displacement between an image-sensing unit and a substrate (for example, in the lengthwise direction of the substrate), the direction and magnitude of the relative displacement tends to deviate from the nominal settings, for example: because the substrate position may be skewed at an angle compared to the nominal position, because a mechanism that transports the substrate (or the sensing device) during imaging may have defects that produce variations in the direction and magnitude of the motion, and so on. Thus, the magnitude and direction of the relative motion between a substrate and an in-line sensing unit may change between successive detection periods when the sensing unit detects markings on the substrate. As a consequence, distortion can occur between the actual markings on the substrate and the markings as they appear in the scan image produced by the imaging device.
Imaging devices have been proposed that implement routines to estimate the actual velocity of the relative displacement that takes place between a substrate and a sensing unit of the imaging device, at different time points during an imaging process. Here we shall refer to the relative displacement velocity as “page velocity” irrespective of the form of the substrate (i.e. irrespective of whether the substrate takes the form of an individual sheet or page or some other form, e.g. a continuous or semi-continuous web), and irrespective of which element moves during the imaging process (i.e. irrespective of whether the substrate is transported past a stationary sensing device, whether the sensing device is moved past a stationary substrate, or whether the relative motion is produced by some combined motion of the substrate and sensing device). Estimation of page velocity may involve: estimating the direction and magnitude of a rotation in the plane of the substrate, estimating coordinates of the rotation centre of such a rotation, and estimating the velocity of translational motion (for example, estimating translational velocity in the nominal direction of the relative displacement between the sensing device and the page, and in a second direction perpendicular to the first direction).
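One common way to recover an in-plane rotation and translation of the kind described above is a least-squares rigid fit between matched point pairs (a two-dimensional Kabsch/Procrustes estimate). The sketch below is not the routine of any particular device: the point coordinates, skew angle, and translation are synthetic values chosen for illustration.

```python
import numpy as np

# Illustrative sketch: least-squares rigid fit (2-D Kabsch/Procrustes)
# recovering an in-plane rotation and translation from matched point
# pairs. All numeric values below are synthetic assumptions.
def fit_rigid_2d(src, dst):
    # Center both point sets, then solve for the optimal rotation via SVD.
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

theta = 0.01                                     # small skew angle (radians)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
t_true = np.array([0.0, 2.5])                    # translational displacement

src = np.array([[0.0, 0.0], [10.0, 0.0], [10.0, 5.0], [0.0, 5.0]])
dst = src @ R_true.T + t_true                    # matched positions after motion

R_est, t_est = fit_rigid_2d(src, dst)
angle_est = np.arctan2(R_est[1, 0], R_est[0, 0])
```

Dividing the estimated translation and rotation by the time elapsed between the matched measurements yields the corresponding velocity estimates.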
Some page velocity estimation routines employ optical flow techniques. One step in the page velocity estimation routine may involve determining the registration between positions of pixels in the scan image and the positions on the substrate that were imaged to produce the scan image data. This step of determining the registration between the scan image and the actual markings on the substrate may involve processing the scan image data to determine how the patterns of intensities of pixels vary along different straight lines in the scan image plane and then processing a digital representation of the target image on the substrate so as to locate, in the digital representation, the positions of pixels having these same patterns of intensities. By matching the patterns of intensities, it becomes possible to determine the relationships between positions of pixels in the scan image and the corresponding points on the substrate which were imaged to generate those pixels. Estimates of the page velocity in translation and rotation may then be calculated using the determined relationships.
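The intensity-pattern matching step described above can be sketched with a normalized cross-correlation search: one line of the scan image is compared against each candidate line of the digital representation, and the best-scoring position gives the registration. The target image here is synthetic, and the noise level is an assumption for illustration.

```python
import numpy as np

# Illustrative sketch: registering one scan-image line against a digital
# representation of the target image by normalized cross-correlation.
# The target image and noise level are synthetic assumptions.
def normalized_correlation(a, b):
    # Zero-mean, unit-norm correlation between two intensity profiles.
    a = a - a.mean()
    b = b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

rng = np.random.default_rng(1)
target = rng.random((20, 32))                        # digital representation
scan_line = target[7] + rng.normal(0.0, 0.01, 32)    # noisy scanned line

scores = [normalized_correlation(scan_line, row) for row in target]
best_row = int(np.argmax(scores))                    # registered line position
```

Repeating such a search along different straight lines in the scan image plane yields the set of correspondences from which the translational and rotational page velocity can then be estimated.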