The present invention relates to a method for preventing collisions between vehicles by correcting a possible error in a detected distance, or in a distance-related index value, detected from an image within the visual field of an image detection module including a pair of image sensing devices, each of which comprises a plurality of image sensors and an optical means, wherein the error is caused by improper assembly of the image detection module.
Various techniques have previously been developed for accurately detecting the distance to a target as a subject, based on the parallax between a pair of images of the target captured by a pair of image sensors.
The conventional distance detection methods using this principle of triangulation are characterized by their passive nature, and have been used in auto-focusing cameras. These methods are now being applied to the prevention of collisions between vehicles.
In the above-noted triangulation technique, a module including a pair of image sensors and an optical device, such as a lens, is used to detect the images of a target. In applications involving auto-focusing cameras, the detection target can reliably be identified through a finder; in applications involving collision prevention, however, the target may not always be located directly in front of the module, e.g., a preceding vehicle may be located at an unspecified lateral angle and must first be found before the distance to it can be determined. Thus, a common package is used as an image detection module, comprising a pair of integrated circuit chips, each containing a plurality of image sensors, together with an optical device, and the visual field used for capturing images containing a target is considerably larger than that of auto-focusing cameras. For example, each image sensor may be a CCD comprising several hundred or more optical sensors.
In such cases, in order to identify the region within the visual field in which a target to be detected exists, it is most practical to divide this wide visual field into a plurality of windows, or sub-visual-fields, disposed in a two-dimensional matrix; to detect, for each window, the parallax between the pair of images captured by the image sensors and thereby the distance; to select the most reliable range of the distance to the target based on the distribution of the plurality of detected distances; and to identify the region within the visual field in which the windows corresponding to this range are clustered as the region in which the target is present.
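The range-selection step above can be sketched as follows. This is a hypothetical Python illustration, not part of the invention: the function name `locate_target`, the dictionary representation of the window matrix, and the one-metre histogram bin width are all assumptions made for the example.

```python
# Hypothetical sketch: selecting the most reliable target range from
# per-window distance detections arranged in a two-dimensional matrix.
from collections import Counter

def locate_target(window_distances, bin_width=1.0):
    """window_distances: dict mapping (row, col) -> detected distance (m).
    Returns (range_lo, range_hi, windows) for the most populated
    distance range, i.e. the region where the target is assumed present."""
    bins = Counter()
    for pos, d in window_distances.items():
        bins[int(d // bin_width)] += 1
    best_bin, _ = bins.most_common(1)[0]            # densest distance bin
    lo, hi = best_bin * bin_width, (best_bin + 1) * bin_width
    region = [pos for pos, d in window_distances.items() if lo <= d < hi]
    return lo, hi, region

# Three windows agree on roughly 32 m (a preceding vehicle); one outlier
# sees the far background, so the 32-33 m range is selected.
lo, hi, region = locate_target({(0, 0): 31.8, (0, 1): 32.1,
                                (1, 0): 32.4, (1, 1): 80.0})
```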
In order to divide the visual field produced by these multiple pairs of image sensors into windows disposed like a matrix, a plurality of windows may be set and disposed within the sub-visual-field of each pair of image sensors. To set a window, the window-part data corresponding to each window may be extracted from the image data, comprising a multiplicity of sensor data, that represent the image pattern captured by each image sensor, in such a way that the window-part data include several tens of sensor data items. To detect the parallax between the pair of images within a window, the pair of window-part data may be shifted relative to each other while checking how well they match; when a match is detected, the corresponding shift value may be taken as the parallax, expressed as a number of sensor data items.
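The shift-and-match procedure described above can be sketched as follows. This is an assumed Python illustration: the function name `detect_parallax` and the use of a sum-of-absolute-differences cost as the match criterion are choices made for the example, not details taken from the invention.

```python
# Hypothetical sketch: find the shift (in sensor pitches) at which a
# pair of window-part data sequences best match.
def detect_parallax(left, right, max_shift):
    """left, right: equal-length sequences of sensor data for one window.
    Returns the shift with the smallest sum of absolute differences,
    taken as the parallax expressed as a number of sensor data items."""
    best_shift, best_cost = 0, float('inf')
    for s in range(max_shift + 1):
        # Overlap left shifted by s pitches against right.
        pairs = zip(left[s:], right[:len(right) - s])
        cost = sum(abs(a - b) for a, b in pairs)
        if cost < best_cost:
            best_shift, best_cost = s, cost
    return best_shift

# The bright feature appears two pitches later in the left window-part
# data than in the right, so the detected parallax is 2.
sigma = detect_parallax([0, 0, 5, 9, 5, 0, 0, 0],
                        [5, 9, 5, 0, 0, 0, 0, 0], max_shift=4)
```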
As is well known, when the parallax is referred to as σ and the array pitch of the optical sensors within an image sensor is referred to as (h), the distance (d) for the image in each window can be calculated using the simple equation d = bf/(hσ), wherein (b) is the base length in triangulation, which is the distance between the optical axes of the pair of lenses in the optical means which form an image on the pair of image sensors, and wherein (f) is their focal distance. In this equation, bf/h is a constant, so the parallax σ can be used directly as an index for the distance (d).
As described in the preceding section, the region within the visual field in which a target is present, and the distance to the target, can be determined from either the directly detected distance (d) or the parallax σ serving as its index, for each window formed by dividing the visual field produced by the multiple pairs of image sensors. However, the incorporation of a pair of integrated circuit chips into the image detection module leads to unavoidable assembly errors involving the image sensors and the optical device, resulting in a small error in the image captured in each window. If there is such an error, the pairs of window-part data do not match exactly. In this case, instead of checking the degree of coincidence between a pair, the correlation between them can be checked, and the shift value at which the best correlation is obtained can be taken as the parallax σ. Even so, as long as there is an error in the assembly of the module, an error unavoidably occurs in the value of the parallax σ obtained, or in the distance (d), depending on the position of the window within the visual field, thereby adversely affecting the accuracy of the estimated range to the target or the detected distance.
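The correlation-based matching described above can be sketched as follows. This is an assumed Python illustration: the invention does not specify a particular correlation measure, so normalized cross-correlation is used here as one plausible choice, and the function names are hypothetical.

```python
# Hypothetical sketch: choose the parallax as the shift with the best
# correlation, tolerating small per-window image errors caused by
# assembly tolerances of the module.
import math

def ncc(a, b):
    """Normalized cross-correlation between equal-length sequences."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def detect_parallax_by_correlation(left, right, max_shift):
    """Return the shift (in sensor pitches) maximizing the correlation
    between the overlapping parts of the window-part data pair."""
    return max(range(max_shift + 1),
               key=lambda s: ncc(left[s:], right[:len(right) - s]))

# Even without an exact data match, the best-correlated shift is found.
sigma = detect_parallax_by_correlation([1, 2, 8, 9, 8, 2, 1, 1],
                                       [8, 9, 8, 2, 1, 1, 1, 1],
                                       max_shift=4)
```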
In view of the foregoing, it is an object of this invention to solve these problems and to correct the detected values so as to consistently obtain an accurate parallax or distance for each window set within the visual field, despite any error in the assembly of the image detection module.