The present invention concerns an image capturing device and in particular such a device of a reading system or reader of optical information of the “imager” type.
Imager type readers of optical information are well known. Such readers comprise an image capturing device capable of capturing or acquiring the image of optical information present on a substrate of whatever kind, including a display on which the optical information is displayed in turn by whatever electrical or electronic device.
In the present description and in the attached claims, the expression “optical information” is used in its widest sense to include both one-dimensional, stacked and two-dimensional optical codes, in which information is encoded in the shapes, sizes, colours and/or reciprocal positions of elements of at least two distinct colours, and alphanumeric characters, signatures, logos, stamps, trademarks, labels, hand-written text and in general images, as well as combinations thereof, in particular present on pre-printed forms, and images containing features suitable for identifying and/or selecting an object based on its shape and/or volume.
In the present description and in the attached claims, the term “light” is used in its widest sense, indicating electromagnetic radiation of a wavelength or of a range of wavelengths not only in the visible spectrum, but also in the ultraviolet and infrared spectra. Terms such as “colour”, “optical”, “image” and “view” are also used in the same widest sense. In particular, the encoded information can be marked on a substrate in invisible ink, but sensitive to ultraviolet or infrared rays.
Imager type readers of optical information typically comprise, in addition to the image capturing device, devices having one or more different other functions, or are in communication therewith.
Among such further devices are mentioned herein: a device for processing the captured image, capable of extracting the information content from such an image or from a portion thereof; a memory device; a device or interface for communicating the acquired image and/or the extracted information content outside the reader; a device or interface for inputting configuration data for the reader, coming from an external source; a device for displaying to the user alphanumeric and/or graphical information relating for example to the operative state of the reader, the content of the information read, etc.; a device for manually inputting control signals and data; an internal device for supplying power, or for taking a power supply signal from the outside.
Moreover, among the further devices that can be included in or associated with an imager type optical information reader are mentioned herein: an aiming device that aids the operator in positioning the reader with respect to the optical information by displaying on the substrate a visual indication of the region framed by the image capturing device, for example its centre and/or at least part of its edges and/or corners; an aid device for correctly focussing the image capturing device (rangefinder), which displays on the substrate a luminous figure having variable shape, size and/or position between a focussed condition and an out-of-focus condition, and possibly indicative of the direction in which to mutually move the image capturing device and the substrate to reach the focussed condition; an outcome indication device, which displays on the substrate a luminous figure indicative of the positive or negative outcome, and possibly of the reasons for a negative outcome, of an attempt at capturing an image and/or decoding the optical information, through variations in shape, size, colour and/or position of the luminous figure; a device for detecting the presence of a substrate and/or for measuring or estimating the reading distance, namely the distance between a reference in the reader, in particular a sensor of the image capturing device, and the substrate. The aiming and focus indication functions can also be implemented together through the projection of a suitable luminous figure, for example a pair of inclined bars or a pair of crosses, which cross each other at their centres or are superimposed on each other, respectively, at the centre of the region framed by the image capturing device only at the focused distance.
The measurement or estimate of the distance is typically used by the reader to activate the decoding algorithm only when the optical information is located at a distance comprised between the minimum and maximum working distance, and/or to control a zoom device and/or a device for automatically changing the focussing distance of the image capturing device (autofocus). Moreover, the measurement or estimate of the distance can be used in the case in which digital restoration of the image is necessary, since the degrading function, or the PSF (point spread function) of the optics of the image forming device, depends upon the reading distance. Furthermore, the measurement or estimate of the distance is necessary to calculate the volume of an object.
Devices for aiming and/or indicating focus are for example described in U.S. Pat. No. 5,949,057, U.S. Pat. No. 6,811,085, U.S. Pat. No. 7,392,951 B2, in U.S. Pat. No. 5,331,176, in U.S. Pat. No. 5,378,883 and in EP 1 466 292 B1.
Outcome indication devices are described, for example, in the aforementioned document U.S. Pat. No. 5,331,176 and in EP 1 128 315 A1.
It is worth emphasising that each of the functions of aiming, indication of the focus condition, outcome indication, detection of presence and measurement or estimate of the reading distance can be implemented in different ways that are per se well known and do not exploit the projection of light on the substrate. Purely by way of example, there are mentioned herein: for the aiming and/or the focus condition, viewfinders and displays showing what is framed by the sensor; for the indication of outcome, sound indications and visual indications projected not on the substrate but towards the operator; for the detection of presence, measurement or estimate of the distance and/or evaluation of the focus condition, photocell systems, radar or ultrasound devices, etc.
An image capturing device of the imager type comprises an image forming device or section, comprising a sensor in the form of an ordered arrangement or array—linear or preferably of the matrix type—of photosensitive elements, capable of generating an electric signal from an optical signal, and typically also a receiver optics of the image, capable of forming an image of the substrate containing the optical information, or of a region thereof, on the sensor.
The image capturing device is characterised by an optical reception axis, which is defined by the centres of the elements of the receiver optics, or by the centres of curvature of the optical surfaces in the case of a single lens, and which defines its main working direction. The image capturing device is also characterised by a working space region, generally shaped like a frustum of pyramid, extending in front of the sensor. The working space region, in other words the region of space in which optical information is correctly framed by the sensor and the image of which is sufficiently focussed on the sensor, is usually characterised through a field of view, which expresses the angular width of the working region about the reception axis, and a depth of field, which expresses its size along the direction of the reception axis. The depth of field therefore expresses the range between the minimum and maximum useful distances, along the reception axis, between the reader and the region on the substrate framed by the sensor. The field of view can also be expressed in terms of “vertical” and “horizontal” field of view, in other words in terms of two angular sizes in planes passing through the reception axis and perpendicular to each other, to take due account of the shape factor of the sensor, or even, in the case of a reception system without any symmetry, four angular sizes in half-planes 90° apart.
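Purely by way of illustration, the linear relation between field of view, reading distance and the extent of the framed region can be sketched as follows (the function name and the numerical values are illustrative assumptions, not taken from the present description):

```python
import math

# Illustrative sketch: on a substrate perpendicular to the reception
# axis, the linear extent of the region framed by the sensor grows
# proportionally with the reading distance, for a fixed field of view.
def framed_width(distance, field_of_view):
    """Linear extent of the framed region along one sensor axis, given
    an angular field of view (radians) centred on the reception axis."""
    return 2.0 * distance * math.tan(field_of_view / 2.0)

# E.g. a 30 degree horizontal field of view at a 200 mm reading distance
# frames a region about 107 mm wide.
```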
The working space region—and therefore the field of view and the depth of field—can be fixed or made dynamically variable in size and/or in proportions through well known zoom and/or autofocus systems, such as electromechanical, piezoelectric or electro-optical actuators for moving one or more lenses or diaphragms, mirrors or other components of the receiver optics or for moving the sensor, and/or for changing the curvature of one or more lenses of the receiver optics, such as liquid lenses or deformable lenses.
EP 1 764 835 A1 describes an optical sensor wherein each photosensitive element or group of photosensitive elements has an associated lens or other optical element, such as diaphragms, prismatic surfaces, light guides or gradient index lenses. Such a document is totally silent about the illumination of the region framed by the sensor.
Although image capturing devices operating with ambient light only are well known, the image capturing device of the imager type typically further comprises an illumination device or section suitable for projecting one or more beams of light, possibly variable in intensity and/or spectral composition, towards the substrate carrying the optical information. The beam of light emitted by the illumination device, or the whole of the beams of light, defines an optical illumination axis, which is the average direction of such a single or composite light beam, being an axis of symmetry thereof in at least one plane and typically in two perpendicular planes in the case of a two-dimensional array.
For correct operation of the image capturing device, the illumination device must be able to illuminate the entire working space region of the image forming device.
An image capturing device wherein, as illustrated in FIG. 1—and which is analogous to that of FIG. 4 of U.S. Pat. No. 5,378,883 referred to above—, the illumination device 90 is not coaxial with the image forming device 91, rather is arranged alongside the image forming device 91 and configured so that the illumination axis 92 of the illumination beam 93 and the reception axis 94 converge, is subject to an intrinsic parallax error and to an intrinsic perspective distortion error in the two-dimensional case. Such errors make the intersection between the substrate S and the illumination beam 93 and the intersection between the substrate S and the working space region 95 of the image forming device 91 substantially concentric at most in a very small range of reading distances (about the distance where the substrate S is partly indicated in FIG. 1). Consequently, in order for the illumination device 90 to be able to illuminate the entire working space region 95 of the image forming device 91, at most of the reading distances the illumination is overabundant (cf. the distances where the substrate S1 or the substrate S2 is partly indicated in FIG. 1), in other words the illumination extends outside of the region framed by the sensor on the substrate, with consequent waste of energy.
In some prior art image capturing devices, the parallax error is eliminated by making the illumination device coaxial with the image forming device.
U.S. Pat. No. 5,319,182 describes an image capturing device, not of the imager type but rather of the scanning type, wherein the illumination device and the sensor are overall coaxial, in that they consist of a matrix in which emitters with programmable activation alternate with the photosensitive elements of the sensor. This device is potentially very compact and flexible, but it is also subject to remarkable problems of optical insulation between the emitters and the photosensitive elements: even by providing for an insulator between them as suggested in the document, the light emitted by the emitters and reflected, even to a minimal extent, onto the photosensitive elements by any surface, such as an opaque dividing wall or the rear surface of a projection optics with anti-reflection treatment, is of much higher intensity than that received from the substrate carrying the optical information. Moreover, laying out on a single substrate photosensitive elements and photo-emitting elements leads to compromises in terms of efficiency since the required characteristics of the material in order to have efficient photo-emitting elements are the opposite to those required to obtain efficient photosensitive elements.
In U.S. Pat. No. 5,430,286 the coaxiality between the light emitted by the illumination device and the image forming device is obtained through a beam splitter. As a result, a very large space is occupied in the reader and efficiency is very low, due to the loss of 50% of power both along the illumination path and along the reception path.
A similar system, which also suffers from problems of occupied space, is described in the aforementioned U.S. Pat. No. 5,331,176, which uses a semi-transparent mirror instead of the beam splitter. Such a document also teaches to adjust the size of the section of the illumination beam, but through mechanical moving devices that contribute to the occupied space and to the consumption of the reader. Moreover, such a solution does not avoid the drawback of wasting energy for illumination, since a portion of the illumination beam is merely obscured.
US 2007/0158427 A1, which represents the closest prior art, in FIG. 5B describes an illumination system comprising a pair of illumination arrays each arranged on opposite sides of the sensor and associated with the greater working distances, and a pair of illumination arrays, also each arranged at said opposite sides of the sensor and associated with the smaller distances. Since the section of the light beam overall emitted by the pair of arrays associated with the greater working distances is oriented and sized to uniformly illuminate the entire region framed by the sensor at least at the maximum distance, it follows that at such a distance and at the shorter reading distances the illumination by such arrays is overabundant, in other words it extends outside of the region framed by the sensor. The same kind of drawback occurs with regard to the pair of arrays associated with the smaller working distances. The device of such a document is therefore scarcely efficient, in particular scarcely suitable for battery-powered portable readers, where energy saving is an important requirement. The document also teaches to switch on only one array of each pair to avoid problems of reflection from the substrate, therefore falling into the case of a system subject to parallax and perspective distortion errors, or to switch on both of the pairs of arrays when the reading distance is unknown. The document further describes a further pair of illuminators, each arranged at the other two sides of the sensor, to illuminate a thin line for reading one-dimensional codes, and four illuminators for aiming a region of interest, arranged at the vertices of the sensor.
The technical problem at the basis of the invention is to provide an efficient image capturing device, and more specifically such a device of an imager type reader of optical information, which in particular is free from parallax error, still without providing overabundant illumination, extending outside of the region framed by the sensor, and which avoids any possibility of optical interference between light sources and photosensitive elements.
In a first aspect thereof, the invention concerns an image capturing device of the imager type, comprising:
an image forming device including a sensor including a one-dimensional or two-dimensional array of photosensitive elements and defining an optical reception axis, at least one reading distance, and a region framed by the sensor on a substrate at said at least one reading distance,
an illumination device including an array of adjacent light sources, defining an optical illumination axis,
characterised:
in that the light sources are individually drivable and each light source is adapted to illuminate an area of a size much smaller than the size of said region framed by the sensor,
in that the illumination axis does not coincide with the reception axis,
by comprising a driver of the light sources adapted to drive the light sources so as to switch off at least the light sources that illuminate outside of the boundary of the region framed by the sensor on the substrate at said at least one reading distance.
In the present description and in the attached claims, the term “optical reception axis” is meant to indicate the direction defined by the centres of the elements of the receiver optics, or by the centres of curvature of the optical surfaces in the case of a single lens.
In the present description and in the attached claims, the term “optical illumination axis” is meant to indicate the average direction of the maximum illumination beam that would be emitted by the illumination device if all of the light sources of the array were switched on—apart from a possible different angular blur of the sources at opposite extremes of the array.
It should be noted that in the present description and in the claims the term “axis” is used for the sake of simplicity, although in practice in both cases it is a half-axis.
In the present description and in the attached claims, the term “adjacent” is meant to indicate that between the light sources there are no components having functions different from the light emitting function and/or from a function slaved to it, such as for example addressing, driving, heat dissipation or optical insulation of the light sources; such a term must not therefore be construed in a limiting sense to indicate that the light sources are in contact with each other.
In the present description and in the attached claims, the term “boundary” of the region framed by the sensor on the substrate is meant to indicate a line having a thickness equal at most to the region illuminated by an individual light source of the array. In other words, the terminology takes into account the fact that the light sources are in any case finite in number, and that every light source illuminates a region having a finite size, thus dictating a resolution limit of the illumination system with respect to the geometric boundary of the region framed by the sensor.
Each individually drivable light source preferably comprises an individual illuminating element, but it could comprise more than one.
Preferably, said at least one reading distance comprises a plurality of reading distances within a depth of field, in other words a plurality of reading distances between the minimum reading distance and the maximum reading distance inclusive.
The reading distances at which the driver is adapted to drive the light sources so as to switch off at least the light sources that illuminate outside of the boundary of the region framed by the sensor on the substrate can be discrete from one another, or continuously variable within the depth of field.
Typically, in order to increase the depth of field and/or to better define the direction and/or the shape in space of the region framed by the sensor, the image forming device further comprises at least one receiver optics, with fixed or variable focal length. Such a receiver optics can in particular comprise a single lens or optical group shared by the photosensitive elements of the sensor and/or an array of lenses, prismatic surfaces and/or diaphragms each associated with a photosensitive element or sub-group of elements, for example as described in the aforementioned EP 1 764 835 A1.
Typically, the image forming device comprises a zoom and/or autofocus system, in which case the region framed by the sensor is variable in a way not directly proportional to the reading distance within the depth of field.
The reception axis can coincide with the normal to the plane of the sensor or be inclined with respect to it by an angle.
Preferably, in order to increase the focal depth on the image side and/or to incline the illumination axis with respect to the normal to the array of light sources, the latter is associated with at least one projection lens. More specifically, each light source can be provided with its own projection lens, and/or at least one single projection lens can be provided, shared by the light sources of the array.
Each projection lens can be replaced by or associated with other optical elements, such as diaphragms, prismatic surfaces, light guides and/or gradient index lenses, in an analogous way to what is described in the aforementioned EP 1 764 835 A1.
The illumination axis can coincide with the normal to the plane of the array or be inclined with respect to it by an angle.
In some embodiments, the illumination axis is parallel to and spaced from the reception axis.
In other embodiments, the illumination axis is inclined and not coplanar with respect to the reception axis. In the case in which the two axes are inclined, they can intersect, generally in front of the sensor, or else they can be oblique.
In some embodiments, the array and the sensor are coplanar, so that they can advantageously be made on a same support, on a same integrated circuit board, or be made on a same integrated circuit substrate.
In other embodiments, the array and the sensor are arranged on planes inclined with respect to each other, which advantageously determines, or contributes to determining, the angle of inclination between the illumination axis and the reception axis.
Preferably, the light sources of the array are adapted, if all of them were switched on, to illuminate overall an area larger than the maximum region framed by the sensor within the depth of field.
More specifically, the number of light sources is selected so that the area overall illuminated on the substrate by the illumination device undergoes a sufficiently small percentage change when a single light source is switched on/off.
Preferably, the percentage change is less than or equal to 15%, more preferably less than or equal to 10%, even more preferably less than or equal to 5%.
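The rationale of this criterion can be illustrated with a minimal sketch: assuming, purely for illustration, a one-dimensional array of N adjacent sources each illuminating an equal-width strip, switching a single source on or off changes the overall illuminated area by a fraction 1/N, so each preferred threshold translates into a minimum number of sources (the function name is hypothetical, not taken from the present description):

```python
import math

# Illustrative sketch: for a one-dimensional array of N adjacent light
# sources that each illuminate an equal-width strip, toggling a single
# source changes the overall illuminated area by a fraction 1/N.
def min_sources(max_percentage_change):
    """Smallest N such that switching one source on/off changes the
    overall illuminated area by at most max_percentage_change (0..1)."""
    return math.ceil(1.0 / max_percentage_change)

# For the preferred thresholds of 15%, 10% and 5%:
# min_sources(0.15) -> 7, min_sources(0.10) -> 10, min_sources(0.05) -> 20
```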
Preferably, the driver is adapted so as not to switch on all of the light sources of the array at any reading distance.
More preferably, the driver is adapted to switch off at least one light source at an edge of the array at each reading distance. In other words, the driver is adapted so as not to switch on both of the light sources arranged at opposite extremes of the array at any reading distance.
Preferably, the driver is adapted to switch off all of the light sources that illuminate outside of the boundary of the region framed by the sensor at the reading distance, and to switch on all of the sources that illuminate within the boundary of the region framed by the sensor in an operating mode.
Preferably, the driver is adapted to switch on only the light sources that illuminate at least one region of interest within the region framed by the sensor in an operating mode.
The driver can respond to a measurer of, or device for estimating, the reading distance.
The measurer of the reading distance can be a distinct device from the reader and in communication with it, for example a system of photocells, a device based on the measurement of the phase or of the time of flight of a laser or LED beam, visible or IR, or of the radar or ultrasound type, etc.
Preferably, however, the driver is adapted to switch on light sources of the array selected to project a luminous figure for evaluating the reading distance in an operating mode. The reading distance is then measured or estimated based on the shape and/or position of the image formed on the sensor by the light emitted by said selected light sources of the array.
The driver can be adapted to switch on light sources of the array selected to overall illuminate a luminous figure for aiming the region framed by the sensor and/or at least one region of interest thereof in an operating mode.
The driver can be adapted to switch on light sources of the array selected to overall illuminate a luminous figure for indicating an outcome of an attempt at capturing an image within the region framed by the sensor in an operating mode.
The light sources of the array are preferably individually drivable also in the intensity of emission.
Preferably, the array of light sources is suitable for emitting light of more than one wavelength. In particular, the array can comprise a first sub-plurality of light sources suitable for emitting at a first wavelength and at least one second sub-plurality of light sources suitable for emitting at a different wavelength from the first wavelength. Alternatively, each light source can be suitable for selectively emitting light of different wavelengths.
With such a provision it is for example possible to adjust the colour of the illumination based on the colour of an optical code and its background. Moreover, it is possible to easily provide a diversified indication of outcome of the capture or reading attempt, for example by projecting a green luminous figure for a positive outcome and a red luminous figure for a negative outcome. Furthermore, it is possible to diversify the luminous figures for aiming plural regions of interest, also for the sake of their selection by the user.
The array of light sources can be one-dimensional or two-dimensional.
The array of light sources can be flat or curved. By arranging the light sources on a curved surface it is possible to make the lengths of the optical paths between each light source and the substrate the same or substantially the same, therefore compensating for the different attenuation that the light emitted by the light sources would undergo in the case of a flat array, and therefore obtaining illumination of uniform intensity at the reading distance. A curved arrangement can also be used to determine or contribute to determining the divergence of the illumination beams of the various light sources.
Preferably, the number of light sources of the array is greater than or equal to 32 in the one-dimensional case, or 32×32 in the two-dimensional case, respectively.
More preferably, the number of light sources of the two-dimensional array is selected from the group consisting of 32×32, 64×64, 44×32 and 86×64, and in the one-dimensional case it is selected from the group consisting of 32 and 64.
In an embodiment the driver is adapted to switch off at least all of the sources that illuminate outside of the boundary of a first half of the region framed by the sensor at the reading distance, the image capturing device further comprising a second array of individually drivable, adjacent light sources, defining a second illumination axis, the second illumination axis not coinciding with the reception axis, and the driver of the light sources being adapted to drive the light sources of the second array so as to switch off at least the light sources that illuminate outside of the boundary of a second half of the region framed by the sensor complementary to the first half.
In an embodiment, the image capturing device further comprises a second array of individually drivable, adjacent light sources, defining a second illumination axis, the second illumination axis not coinciding with the reception axis, and the driver of the light sources being adapted to drive the light sources of the second array so as to switch off at least the light sources that illuminate outside of the boundary of the region framed by the sensor.
In an embodiment the driver is adapted to determine at run-time which light sources of the array to switch on or off, respectively, as a function at least of the reading distance.
In embodiments, the run-time determining is carried out through an analytical method, in other words making use of analytical formulae that depend only upon known (design) geometric parameters of the reader, and in particular of its image forming device, of its illumination device and/or of their relative spatial arrangements, including the relative spatial arrangement of their components or subassemblies.
Preferably, the analytical method comprises the steps of:
in a first reference system associated with the reception device, calculating the coordinates of peculiar points of the region framed on the substrate by the sensor;
carrying out a transformation of coordinates into a second reference system associated with the illumination device; and
in the second reference system, calculating the light sources of the array that illuminate corresponding peculiar points.
Preferably, in the aforementioned steps one or more of the formulae from (1) to (31) described below are implemented.
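Without reproducing formulae (1) to (31), the three steps of the analytical method can be sketched as follows under simplified, purely illustrative assumptions: a substrate perpendicular to the reception axis, an illumination array laterally offset and tilted about a single axis, and a uniform angular pitch between adjacent beams. All names and parameters below are hypothetical, not taken from the present description:

```python
import numpy as np

def framed_corners(distance, half_fov_x, half_fov_y):
    """Step 1: corners of the region framed by the sensor on a substrate
    perpendicular to the reception axis, in the reception reference
    system (z along the reception axis)."""
    hx = distance * np.tan(half_fov_x)
    hy = distance * np.tan(half_fov_y)
    return np.array([[ hx,  hy, distance],
                     [-hx,  hy, distance],
                     [-hx, -hy, distance],
                     [ hx, -hy, distance]])

def to_illumination_frame(points, offset, tilt_y):
    """Step 2: rigid transformation into the illumination reference
    system, here a translation by 'offset' followed by a rotation by
    'tilt_y' about the y axis."""
    c, s = np.cos(tilt_y), np.sin(tilt_y)
    rot = np.array([[c, 0.0, -s],
                    [0.0, 1.0, 0.0],
                    [s, 0.0, c]])
    return (points - offset) @ rot.T

def sources_for_corners(points, pitch_angle, n_side):
    """Step 3: map each corner to the (row, col) index of the source
    whose beam direction is closest, for an n_side x n_side array with
    a uniform angular pitch between adjacent beams."""
    angles_x = np.arctan2(points[:, 0], points[:, 2])
    angles_y = np.arctan2(points[:, 1], points[:, 2])
    centre = (n_side - 1) / 2.0
    cols = np.clip(np.round(centre + angles_x / pitch_angle), 0, n_side - 1)
    rows = np.clip(np.round(centre + angles_y / pitch_angle), 0, n_side - 1)
    return np.stack([rows, cols], axis=1).astype(int)
```

Once the extreme source indices are known, the driver would switch on only the sources whose indices fall within the quadrilateral they delimit.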
In embodiments, the run-time determining is carried out at least in part through an empirical or adaptive method, comprising, in a recursive manner, driving so as to switch on a subset of light sources, evaluating the position and/or extent of the illuminated area on the substrate with respect to the region framed by the sensor, and adapting the subset of light sources based on such an evaluation.
The initial subset of light sources can be determined in advance in an analytical manner, the empirical or adaptive method thus being used for example to correct imprecisions of the array of light sources of each image capturing device of a production batch.
In embodiments, said recursive adaptation of the subset of light sources to be switched on is carried out along a plurality of radially spaced directions.
In embodiments, the subset of light sources to be switched on is determined by an interpolation of the positions of the extreme light sources to be switched on along said plurality of directions.
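The recursive adaptation along one radial direction can be sketched minimally as a bisection, assuming a monotone feedback predicate that reports whether a trial source still illuminates inside the framed region (the predicate and all names are hypothetical stand-ins for the actual evaluation of the captured image):

```python
# Illustrative sketch of the empirical/adaptive determination: along one
# radial direction, find by bisection the outermost source to keep
# switched on, given a feedback predicate that is True near the centre
# of the framed region and False past its boundary.
def outermost_inside(illuminates_inside, n_steps):
    """Largest step index k in [0, n_steps) for which
    illuminates_inside(k) is True; -1 if even step 0 falls outside."""
    lo, hi, best = 0, n_steps - 1, -1
    while lo <= hi:
        mid = (lo + hi) // 2
        if illuminates_inside(mid):
            best, lo = mid, mid + 1
        else:
            hi = mid - 1
    return best

# With a synthetic boundary at step 12 out of 32:
# outermost_inside(lambda k: k <= 12, 32) -> 12
```

Repeating this search along several radially spaced directions and interpolating the resulting extreme indices would then yield the whole subset of sources to switch on.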
In an alternative embodiment, the driver is adapted to determine which light sources to switch on or off, respectively, as a function of the reading distance by reading them from a look-up table.
The driver can be adapted to build one-off (una tantum) said look-up table, in particular with analytical or empirical/adaptive method, similarly to the run-time determining.
Alternatively, the driver can be adapted to receive as an input said look-up table, one-off built by a separate processing device, with analytical or empirical/adaptive method, similarly to the run-time determining.
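The look-up table approach can be sketched as follows (the construction function is a stand-in for the analytical or empirical/adaptive determination, and all names are illustrative, not taken from the present description):

```python
# Illustrative sketch: the look-up table is built one-off, mapping each
# discrete reading distance to the set of source indices to switch on,
# and read at run-time by the driver.
def build_lut(distances, sources_on):
    """One-off construction: 'sources_on(d)' stands in for the analytical
    or empirical/adaptive determination at reading distance d."""
    return {d: frozenset(sources_on(d)) for d in distances}

def lookup(lut, distance):
    """Run-time read-out: use the entry tabulated for the reading
    distance nearest to the measured or estimated one."""
    nearest = min(lut, key=lambda d: abs(d - distance))
    return lut[nearest]
```

A toy table with two tabulated distances, for example, would return the near-distance entry for any measured distance closer to it than to the far one.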
Should the one-off determining of the light sources to be switched on or off, respectively, as a function of the reading distance occur in a separate processing device, it is preferably implemented by a computer program that parametrically manages one or more quantities of the image capturing device. In this way, the same computer program can advantageously be used, for example, for a range of reader models.
Such a computer program represents a further aspect of the invention.
The light sources of the array are preferably of the solid state type or are organic, and more preferably they are selected from the group comprising LEDs, OLEDs, microLEDs and microlasers.
In another aspect thereof, the invention concerns an imager type reader of optical information comprising an image capturing device as described above.
In another aspect thereof, the invention concerns a computer readable memory means comprising the aforementioned program.
In another aspect thereof, the invention concerns an optical reader comprising an array of individually drivable, adjacent light sources, and a driver adapted to drive the light sources of the array in an illumination mode, an aiming mode, and a reading outcome indication mode.
Preferably, said driver is also adapted to drive the light sources in an optical distance measurement system or measurer mode.