The present invention relates to a distance measuring or range-finding apparatus. More particularly, the invention is concerned with a distance measuring apparatus which is adapted to continuously or successively measure a distance to an object such as a motor vehicle or the like.
Distance measuring apparatuses using image sensors are well known, for example, as disclosed in Japanese Patent Publication No. 63-46363. An illustrative structure of this known apparatus is shown in FIG. 5.
Referring to the figure, the apparatus includes a pair of lenses 1, 2 representing right and left optical systems disposed with a horizontal distance therebetween which corresponds to a base length B. Disposed at the rear sides of the lenses 1 and 2, respectively, are two-dimensional image sensors 3 and 4, each at a distance from the associated lens corresponding to its focal length. An object 5 is located in front of the lenses 1 and 2 at a distance R from a line or plane passing through the centers of the lenses 1, 2. A pair of analogue-to-digital (A/D) converters 6 and 7 convert analogue signals input from the image sensors 3 and 4, respectively, to digital signals. A pair of memories 8, 9 store the digital signals or data input from the A/D converters 6 and 7, respectively. A central processing unit (CPU) 10 processes the image data stored in the memories 8 and 9 to thereby determine the distance to the object 5.
In operation, the image of the object 5 is focused onto the image sensors 3 and 4 through the lenses 1 and 2, respectively. Image or picture signals resulting from photoelectric conversion of the light images sensed by the image sensors 3 and 4 are converted to digital signals or data through the respective A/D converters 6 and 7 to be subsequently stored in the memories 8 and 9. The CPU 10 processes the image data stored in the memories 8 and 9 for arithmetically determining the distance to the object 5.
The image data processing executed by the CPU 10 will be described below in some detail. At a first step, the CPU 10 reads out from the memories 8 and 9 the image data in the form of picture elements or pixels located at the addresses corresponding to the leftmost top positions in the images as sensed by the image sensors 3 and 4, respectively, and arithmetically determines an absolute difference between these image data. Next, the CPU 10 reads out from the memories 8 and 9 the image data corresponding to the second or next leftmost top pixels (i.e., the pixels moved one pixel to the right from the leftmost top pixels), respectively, to arithmetically determine an absolute difference between these image data. Similar processing is performed sequentially, shifting one pixel at a time, for all the pixels located in the effective image areas of the memories 8 and 9. By summing the absolute differences thus determined, a first integrated value is obtained.
Subsequently, the CPU 10 reads out from the memories 8 and 9 the image data of the pixel located at the leftmost top position in the image sensed by the image sensor 3 and that of the pixel located at the second or next leftmost top position in the image sensed by the image sensor 4, respectively, to thereby arithmetically determine an absolute value of the difference between these image data. Similar processing is successively performed, shifting the image data to the right one pixel at a time, for all the pixels sensed by the image sensors 3 and 4. The absolute differences thus determined are then summed up to provide a second integrated value.
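The first and second integrated values described above are both instances of a sum of absolute pixel differences computed at a given horizontal shift. The comparison procedure may be sketched as follows (a minimal illustration, not the patented implementation; the function and parameter names are chosen for explanation only):

```python
import numpy as np

def integrated_value(left, right, shift):
    """Sum of absolute differences between the image from one sensor and
    the image from the other sensor displaced horizontally by `shift`
    pixels. shift = 0 yields the first integrated value described above;
    shift = 1 yields the second."""
    h, w = left.shape
    # Only the columns that overlap after the horizontal displacement
    # are compared, pixel by pixel.
    diff = np.abs(left[:, shift:].astype(int) - right[:, :w - shift].astype(int))
    return int(diff.sum())
```

The cast to `int` avoids overflow when the pixel data are stored as 8-bit unsigned values.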
In this manner, the CPU 10 arithmetically calculates the sums of the absolute values of differences between the pixel signals of the image sensor 4 and those of the image sensor 3, the latter being sequentially displaced or shifted one pixel at a time relative to the former. Finally, the CPU 10 determines the minimum one of the integrated or summed values thus calculated. When the minimum value originates in the pixels of the image sensors 3 and 4 which are displaced from each other by a distance corresponding to a number n of the pixels, this means that the right and left images picked up by the sensors 3 and 4 are shifted or deviated by a distance corresponding to n pixels relative to the optical axis of the lens 1 or 2. In other words, by representing by p the pitch of the pixels in the image sensors 3, 4, the magnitude of the deviation or distance between the right and left images is given by (n × p), whereby the distance R to the object 5 can be determined by the triangulation method as follows:

R = (f × B)/(n × p)    (1)
where B represents the base length between the centers of the lenses 1, 2; and f represents the focal length of the lenses 1, 2.
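The complete procedure, i.e. searching over pixel shifts for the minimum integrated absolute difference and then applying equation (1), can be sketched as below. This is an illustrative reconstruction under stated assumptions: the search range `max_shift` and the handling of a zero disparity (object effectively at infinity) are not specified in the text.

```python
import numpy as np

def measure_distance(left, right, f, B, p, max_shift):
    """Estimate the distance R to the object by the method described above.

    left, right : 2-D arrays of pixel data from image sensors 3 and 4
    f           : focal length of the lenses (same units as B and p)
    B           : base length between the lens centers
    p           : pixel pitch of the image sensors
    max_shift   : maximum pixel displacement searched (an assumption)
    """
    h, w = left.shape
    sums = []
    for n in range(max_shift + 1):
        # Integrated value at shift n: sum of absolute pixel differences
        # over the overlapping region of the two images.
        diff = np.abs(left[:, n:].astype(int) - right[:, :w - n].astype(int))
        sums.append(int(diff.sum()))
    n = int(np.argmin(sums))       # shift giving the minimum integrated value
    if n == 0:
        return float("inf")        # no measurable deviation between the images
    return (f * B) / (n * p)       # equation (1): R = (f x B)/(n x p)
```

For instance, with f = 0.05, B = 0.2 and p = 0.001 (all in metres), a minimum at n = 2 gives R = (0.05 × 0.2)/(2 × 0.001) = 5 m.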
With the arrangement of the conventional distance measuring or range-finding apparatus as described above, an enormous amount of time is required to arithmetically determine the distance to the object, because the comparisons of the image data (pixel signals) mentioned above are performed for all the pixels of the entire images sensed by the image sensors 3 and 4. As a consequence, it is very difficult, if not practically impossible, to apply the conventional apparatus to an inter-vehicle distance alarm system, an automatic vehicle-following system or the like, in which the distance to a moving object such as a preceding vehicle must be measured continuously or successively and swiftly.