In many applications of image capturing devices, it is necessary to establish an accurate relationship between the actual positions of a number of feature points on an object and the coordinates of the corresponding points in the captured image of the object. This makes it possible to accurately identify the positions of such points on the object by analyzing the captured image. This is highly important, for instance, in robotic applications where an object is required to be manipulated or otherwise acted upon according to the information gathered by capturing the image of the object.
The factors that prevent a predictable correspondence between the object and captured image include errors in the mechanical arrangement of the optical elements of the image capturing device, optical distortions that are intrinsic to the optical system of the image capturing device, optical distortions of the transparent shield which is placed in front of the image capturing device (diffractive aberration), and irregularities in the electronic imaging device that is used in the image capturing device, among other possibilities. It is known to capture the image of a grid pattern with the image capturing device, and compare the captured image with the original grid pattern so that the distortion caused by the image capturing device may be evaluated and this data may be used for calibrating the image capturing device. See Japanese patent laid-open publication No. 11-355813A.
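The grid-pattern comparison described above can be sketched as follows. This is a minimal, hypothetical illustration, not the method of the cited publication: the point lists, their pairing by index, and the numeric values are all assumptions.

```python
# Hypothetical sketch: estimate the distortion introduced by an image
# capturing device by comparing the detected positions of grid points in
# the captured image with the ideal positions in the original pattern.
# Point pairing and values are illustrative assumptions.

def distortion_map(ideal_points, detected_points):
    """Return the (dx, dy) displacement at each grid point.

    ideal_points, detected_points: lists of (x, y) pixel coordinates,
    paired by index (i.e., detected_points[k] is the captured image of
    ideal_points[k]).
    """
    return [
        (px - ix, py - iy)
        for (ix, iy), (px, py) in zip(ideal_points, detected_points)
    ]

# Example: three grid corners and their slightly displaced captured positions.
ideal = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
detected = [(0.2, -0.1), (10.3, 0.0), (0.1, 10.4)]
displacements = distortion_map(ideal, detected)
```

The resulting displacement field is the raw data from which a calibration (lookup table or fitted function) would then be built.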
The calibration can be in the form of a lookup table. However, a high level of resolution is normally required, and the lookup table then requires a correspondingly large storage space. Alternatively, the calibration can be effected by approximation with high-order mathematical functions. However, handling high-order mathematical functions requires a large amount of computation. Therefore, this prior art is unsuitable for simple systems having a limited storage capacity and a limited computational capacity and, more importantly, unsuitable for real-time applications that require prompt responses.
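The storage/computation trade-off above is often softened in practice by storing a coarse correction table and interpolating between its nodes. The following is a minimal sketch of that compromise, assuming a uniform grid of displacement values; the table contents, grid step, and function names are illustrative, not from the source.

```python
# Hypothetical sketch: a coarse correction lookup table kept small by
# bilinear interpolation between grid nodes, as a middle ground between
# a full-resolution table (large storage) and a high-order function
# (heavy computation). All values are illustrative assumptions.

def correct(x, y, table, step):
    """Apply the interpolated correction at pixel (x, y).

    table[j][i] holds the (dx, dy) displacement measured at the grid
    node located at (i * step, j * step).
    """
    i, j = int(x // step), int(y // step)
    fx = (x - i * step) / step      # fractional position within the cell
    fy = (y - j * step) / step

    def lerp(a, b, t):
        return a + (b - a) * t

    corrected = []
    for c in (0, 1):  # interpolate dx (c=0) and dy (c=1) separately
        top = lerp(table[j][i][c], table[j][i + 1][c], fx)
        bot = lerp(table[j + 1][i][c], table[j + 1][i + 1][c], fx)
        corrected.append(lerp(top, bot, fy))
    return x + corrected[0], y + corrected[1]

# Example: a 2x2 table of nodes, each shifting points by (1.0, 0.0).
table = [[(1.0, 0.0), (1.0, 0.0)],
         [(1.0, 0.0), (1.0, 0.0)]]
result = correct(5.0, 5.0, table, step=10.0)  # → (6.0, 5.0)
```

A table of a few dozen nodes plus four multiplications per axis per lookup is far cheaper than either a per-pixel table or a high-order polynomial evaluation.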
In a video camera for electronically recording or reproducing a color image of an object, typically, a color separation device is placed behind the lens for separating the image into the three basic colors of R (red), G (green) and B (blue), and the light of each of these basic colors is converted into an image signal before being combined with the image signals of the other colors. The image signal for each color may be individually processed for a desired effect, and the combined image signals allow the color image to be stored and reproduced as required.
There are a number of ways to color-separate an image. For instance, a dichroic prism having a number of reflective surfaces, each consisting of a multi-layer optical interference film, may be used for this purpose. Each reflective surface selectively reflects light of a prescribed color so that color-separated images may be formed on separate electronic imaging devices such as CCD panels.
When a transparent shield is placed in front of such a camera, the shield may diffract different colors of the incident light differently, so that various points of an object may be associated with the corresponding points on the imaging device differently depending on the color of each particular point. This can be considered one form of chromatic aberration. In other words, when identifying the actual spatial position of each point on the object from the position of the corresponding point on the imaging device, some error may occur depending on the way the light passing through the shield is diffracted.
Japanese patent laid-open publication No. 2-205187A discloses the use of a separate chromatic aberration compensation circuit for each of the basic RGB color signals for the purpose of canceling the influences of chromatic aberration. For such compensation circuits to be effective, it is necessary to compare the position of each particular point on the object with the position of the corresponding point on the imaging device for calibration. Conventionally, such a calibration was executed by using an incandescent lamp that emits white light containing a wide range of wavelengths.
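The per-channel compensation idea can be sketched very simply: each color channel gets its own coordinate correction before the channels are recombined. This is a hypothetical illustration, not the circuit of the cited publication; the radial-scale model, the scale factors, and the assumed optical center are all invented for the example.

```python
# Hypothetical sketch: compensating chromatic aberration by applying a
# separate coordinate correction to each color channel. Here the
# correction is modeled as a per-channel radial scaling about an assumed
# optical center; the scale factors are illustrative, not measured.

CHANNEL_SCALE = {"R": 1.002, "G": 1.000, "B": 0.997}  # assumed values

def compensate(x, y, channel, cx=320.0, cy=240.0):
    """Scale a pixel coordinate about the optical center (cx, cy)
    by the channel's assumed magnification factor."""
    s = CHANNEL_SCALE[channel]
    return cx + (x - cx) * s, cy + (y - cy) * s

# Example: a point 100 pixels right of center lands slightly farther
# out in the R channel than in the G channel.
r_pos = compensate(420.0, 240.0, "R")
g_pos = compensate(420.0, 240.0, "G")
```

In a real system the per-channel mapping would come from the calibration measurement rather than fixed scale factors, but the structure, one correction per basic color applied before recombination, is the same.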
However, the RGB components color-separated from the white light generally do not have the same energy level, and the amplification gains for the signals from the imaging devices, such as CCD panels, must be adjusted individually for each basic color. The calibration process therefore tends to be both complex and time consuming.