a) Field of the Invention
The present invention relates generally to a position measuring apparatus applicable to a mobile body such as an automotive vehicle. More particularly, the present invention relates to a technique which detects a target object (an obstacle) located in a forward detection zone or a rearward detection zone of the mobile body on which the position measuring apparatus is mounted, using a stereophotographic image processing method, and which accurately determines the spatial distance to the obstacle and the position of each lateral end of the obstacle without being influenced by the ambient environment. The technique is applicable to an obstacle detecting apparatus for an automotive vehicle, an automatic vehicular velocity controlling apparatus which follows a preceding vehicle running ahead in the same traffic lane, a brake controlling apparatus which avoids a collision against an obstacle, an automatic vehicular steering apparatus, and so forth.
b) Description of the Related Art
A Japanese Patent Application First Publication No. Heisei 8-278126, published on Oct. 22, 1996, exemplifies a previously proposed distance measuring apparatus and method applied to an automotive vehicle to detect an obstacle. In the distance measuring apparatus and method disclosed in that publication, the object to be detected is a preceding vehicle.
A preceding vehicle is usually present above the road surface, and many vehicle bodies have long lateral (horizontal) edges. Hence, the search range for the object to be detected is restricted to a zone of the image above the road surface, the road surface being detected by means of white line detection (the white lines serve to partition the traffic lanes). The position of the preceding vehicle on the image is then determined by searching for lateral edges above the road surface, and stereophotographic matching is thereafter carried out at the determined position to derive the distance to the preceding vehicle.
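Stereophotographic matching derives the distance from the disparity (parallax) between the matched positions of the same feature in the two images. The triangulation step for a parallel-axis camera pair can be sketched as follows (an illustrative example only, not the publication's implementation; the focal length and baseline values are hypothetical):

```python
def distance_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Distance Z = f * B / d for a parallel-axis stereo camera pair.

    disparity_px    -- horizontal shift of the matched feature, in pixels
    focal_length_px -- camera focal length expressed in pixels
    baseline_m      -- lateral separation of the two optical axes, in meters
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite distance")
    return focal_length_px * baseline_m / disparity_px

# Example with assumed values: f = 800 px, baseline = 0.5 m, disparity = 10 px
print(distance_from_disparity(10, 800, 0.5))  # -> 40.0 (meters)
```

Note that the distance is inversely proportional to the disparity, which is why matching errors of a fraction of a pixel matter most for distant objects.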
However, in the obstacle detecting method disclosed in the above-described Japanese Patent Application First Publication, the detection zone for the preceding vehicle is restricted to the area above the traffic lane on which the equipped vehicle is running, that lane being detected on the basis of the white lines. Hence, the zone above the road surface cannot be restricted when the preceding vehicle is running on a traffic lane having no white lines, or when the preceding vehicle is so close to the equipped vehicle that the white lines are partially or wholly hidden. In such situations, the search range for the horizontal (lateral) edges of the preceding vehicle cannot be restricted, and the detection of the preceding vehicle on the image becomes unstable.
In addition, in the obstacle detecting method disclosed in the above-described publication, a search for vertical (longitudinal) edges above the road surface is carried out. Stereophotographic matching using the vertical edges detects the distance to the preceding vehicle and its position on the image. The position of each lateral (horizontal) end of the preceding vehicle is then estimated from the vertical end edges present above the road surface. It is, however, difficult to distinguish each lateral end of the preceding vehicle from the ends of other objects and of the background. Consequently, the accuracy of the detection of the lateral (horizontal) ends of the preceding vehicle is lowered.
It is, therefore, an object of the present invention to provide a position measuring apparatus which can detect the position of an obstacle without being influenced by the ambient environment, such as the white lines and the position of the vehicle on which the apparatus is mounted, and which can determine the positions of the lateral ends of the obstacle with high accuracy.
The above-described object can be achieved by providing a position measuring apparatus, comprising:
a pair of electronic cameras mounted on a mobile body, the optical axes of the cameras being mutually parallel and directed toward a forward or rearward direction of the mobile body, and the horizontal axes of their photograph planes being aligned on the same line;
an image split region setting section to split one image, photographed by one of the pair of cameras, into a plurality of image split regions, each having a predetermined dimension;
a parallax calculating section to determine, for each image split region, a similarity between the image present within that region and the other image, photographed by the other of the pair of cameras, at positions corresponding to that region, to calculate the difference in position between that region and the position of the other image at which the similarity is highest, and to derive a parallax for each image split region from the calculated difference in position;
a voting section including a table whose lateral axis corresponds to the horizontal position of each image split region and whose longitudinal axis corresponds to the value of the parallax, the voting section accumulatively adding a predetermined numerical value to each position in the table corresponding to both the horizontal position of an image split region and the value of the parallax at that region, so as to vote for each position in the table;
a distance calculating section to calculate the distance from the camera setting position of the pair of cameras to a target object present in the forward or rearward direction, on the basis of the value of the parallax at any horizontal position in the table at which the result of voting by the voting section indicates a value higher than a first threshold value;
a horizontal range calculating section to derive the horizontal range on the one image over which the target object is photographed, on the basis of the horizontal positions of the image split regions corresponding to the positions in the table at which the result of voting indicates a value higher than the first threshold value;
a vertical range calculating section to derive first histograms of horizontal edge components for each vertical coordinate value of the image within the horizontal range derived by the horizontal range calculating section, and to derive a vertical range at each end of which the histogram value of the first histograms indicates a value higher than a second threshold value;
an edge selecting section to derive second histograms of vertical edge components for each horizontal coordinate value of the image within the vertical range derived by the vertical range calculating section, to search for longitudinal edges present on the one image determined from the second histograms, and to select, from among the searched longitudinal edges, the longitudinal edge located in the vicinity of each end of the horizontal range derived by the horizontal range calculating section; and
an actual position calculating section to derive the actual spatial position of each lateral end of the target object with respect to the camera setting position, each lateral end corresponding to a selected longitudinal edge on the one image, on the basis of the distance from the camera setting position to the target object derived by the distance calculating section and the coordinate value of each longitudinal edge selected by the edge selecting section.
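The parallax voting scheme described above can be sketched in Python. The following is a hypothetical illustration, not the publication's implementation: the region size, the disparity search range, and the sum-of-absolute-differences similarity measure are all assumptions, and the shift direction assumes the right image is displaced rightward relative to the left image.

```python
import numpy as np

def build_vote_table(left, right, region_w=8, region_h=8, max_disp=32):
    """Split the left image into small regions, find the best-matching
    horizontal shift of each region in the right image (sum of absolute
    differences), and accumulate one vote per region into a table indexed
    by (parallax value, horizontal region position)."""
    h, w = left.shape
    n_cols = w // region_w
    table = np.zeros((max_disp, n_cols), dtype=np.int32)
    for ry in range(0, h - region_h + 1, region_h):
        for cx in range(n_cols):
            rx = cx * region_w
            patch = left[ry:ry + region_h, rx:rx + region_w].astype(np.int32)
            best_d, best_cost = 0, None
            for d in range(max_disp):
                if rx + d + region_w > w:
                    break  # candidate window would leave the image
                cand = right[ry:ry + region_h,
                             rx + d:rx + d + region_w].astype(np.int32)
                cost = int(np.abs(patch - cand).sum())
                if best_cost is None or cost < best_cost:
                    best_cost, best_d = cost, d
            table[best_d, cx] += 1  # one vote for this (parallax, column) cell
    return table
```

Because every region in a vertical strip of the image that covers the target object tends to yield the same parallax, the votes pile up in a few table cells; comparing each cell against a threshold then gives both the parallax (hence the distance) and the horizontal extent of the object, as in the distance and horizontal range calculating sections above.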