To focus an auto-focusing camera or the like on an object by a so-called passive method, a distance to the object needs to be detected by an external-light triangulation distance measuring method in the case of a lens-shutter camera, or, in the case of a TTL camera, a degree of deviation from the focused condition needs to be detected by use of internal light passing through an image pick-up lens, as known in the art. In either case, a pair of images of the object are usually formed on a pair of optical sensor arrays, through mutually different optical paths, by means of an optical system including lenses, so as to electrically detect a difference in the relative positions of the pair of images. While this difference is defined on a position coordinate, with respect to reference positions of the optical sensor arrays which correspond to the optical axes of the optical system, it may also be defined on a time coordinate, with respect to reference phases corresponding to the reference positions, and is therefore usually called a phase difference.
This phase difference has been detected by a conventional method including the steps of: 1) preparing, for each optical sensor array, a sensor data group representative of a pattern of the light-intensity distribution of the image, which consists of sensor data received from a plurality of optical sensors in the array; 2) preparing, from each of the left and right sensor data groups, a plurality of subgroups each consisting of a series of a fixed number of sensor data, such that the portion of the sensor data group from which each subgroup is picked up is sequentially shifted; and 3) preparing a plurality of combinations of two subgroups corresponding to the left and right optical sensor arrays.
In the next step, a degree of correlation between the subgroups is observed with respect to each combination, to find the combination having the maximum correlation. Although each subgroup represents only a part of the image pattern of the object, the maximum correlation between the two subgroups selected from the left and right sensor data groups means that the parts of the left and right image patterns represented by those subgroups substantially coincide with each other. This makes it possible to know how much the left and right images should be shifted, from the portions of the sensor data groups from which the two subgroups of the maximum-correlation combination are selected, so as to bring the image patterns into coincidence with each other. Namely, the difference between the relative positions of the pair of images can be detected.
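The steps described above can be sketched in code. The following Python fragment is an illustration only, not the exact method of this disclosure: the window length and the use of the sum of absolute differences (SAD) as the correlation index are assumptions made for the example, and an exhaustive enumeration of all subgroup positions stands in for whatever predetermined order a real implementation would use.

```python
def sad(a, b):
    """Sum of absolute differences: a smaller value means a higher
    degree of correlation between the two subgroups."""
    return sum(abs(x - y) for x, y in zip(a, b))

def best_combination(left, right, window=6):
    """Enumerate every combination of a left subgroup and a right
    subgroup of length `window`, and return the start positions
    (left_start, right_start) of the maximum-correlation combination
    together with the list of correlation indices."""
    positions = range(len(left) - window + 1)
    combos = [(i, j) for i in positions for j in positions]
    scores = [sad(left[i:i + window], right[j:j + window])
              for (i, j) in combos]
    k = scores.index(min(scores))   # maximum correlation = minimum SAD
    return combos[k], scores

# The phase difference, in units of the sensor array pitch, is the shift
# right_start - left_start of the best-matching combination.
left = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8, 9, 7, 9, 3]
right = [0, 0] + left[:-2]          # same pattern, shifted by 2 pitches
(i, j), _ = best_combination(left, right)
phase = j - i                       # -> 2
```

Because the right image here is an exact copy of the left shifted by two sensor pitches, the minimum SAD is zero and the recovered phase equals the shift.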
In actual application, the plurality of combinations of the subgroups are numbered in a predetermined order, and the phase difference between the pair of images is easily calculated by adding constants to, or subtracting them from, the number of the combination determined to have the maximum correlation, the constants being determined by, for example, the positional relationship between the optical sensor arrays and the image-forming optical system. While the phase difference thus detected is, as such, an integer expressed in units of the array pitch of the optical sensors in each optical sensor array, an estimated value may be calculated with respect to each combination, as an index representing the degree of correlation between the subgroups in the combination, and the phase difference is usually detected to an accuracy of about two decimal places by interpolation using the estimated values of the combinations before and after the one having the maximum correlation.
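The interpolation step can be sketched as follows. Fitting a parabola through the correlation index of the best combination and its two neighbours is one common choice, assumed here for illustration rather than taken from this disclosure; a smaller index means a higher correlation.

```python
def subpixel_phase(scores, k):
    """Refine the integer phase k (index of the maximum-correlation
    combination) using the correlation indices of the combinations
    before and after it.  A parabola is fitted through the three
    points and its extremum is returned as the interpolated phase,
    in units of the sensor array pitch."""
    s0, s1, s2 = scores[k - 1], scores[k], scores[k + 1]
    denom = s0 - 2.0 * s1 + s2
    if denom == 0.0:              # flat neighbourhood: nothing to refine
        return float(k)
    return k + 0.5 * (s0 - s2) / denom

# A symmetric neighbourhood leaves the integer phase unchanged, while an
# asymmetric one pulls the result toward the better-matching side:
subpixel_phase([2, 1, 2], 1)      # -> 1.0
subpixel_phase([5, 1, 2], 1)      # interpolated phase of about 1.3
```

This is how a phase difference detected in integer pitch units becomes a value with roughly two decimal places of resolution.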
In the conventional phase difference detecting method as described above, the phase difference is detected by preparing the left and right sensor data groups representative of the patterns of the pair of images formed on the left and right optical sensor arrays, sequentially shifting the portions of these sensor data groups from which the subgroups are picked up, and determining the combination having the maximum correlation between the left and right subgroups. This method rests principally on the assumption that the maximum correlation is detected when the image patterns represented by the left and right subgroups coincide with each other. If the left and right images of the object differ in brightness or pattern for some reason, this assumption no longer holds accurately, resulting in reduced accuracy of the detected phase difference.
The above problem may occur when a kind of stray light called flare intrudes into one of the left and right optical sensor arrays, as when the arrays catch an image of an object backlit by the sun or by bright illumination such as neon signs. This flare corresponds to fog in photography. The affected optical sensor array receives a much brighter image, due to the extraordinarily intense light, than the other optical sensor array, and provides a pattern of light-intensity distribution of this image which is considerably different from that provided by the other sensor array.
FIG. 6 shows the result of observation of errors Δσ in detected values of phase differences σ, when the left and right images of an object differed in brightness or pattern due to intrusion of stray light. The object used in this test had a simple pattern in which the brightness differed between its left half and its right half, and the contrast between the two halves was varied in five steps or degrees. The axis of abscissas in the graph of FIG. 6 indicates a so-called EV (Exposure Value) representing this contrast, and, as is well known in the art, a difference of 1 EV means that the brightness of one half is twice that of the other half. Although the error Δσ in the phase difference σ was reduced with an increase in the EV, the error Δσ still exceeded 0.5, which is an ordinary permissible upper limit, even when the EV was 4, that is, even when the brightness of one of the left and right halves of the object was as much as 16 (= 2^4) times that of the other half.
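The brightness ratios in the test above follow directly from the stated definition of the EV scale, under which each additional EV doubles the ratio. A minimal arithmetic check, illustrative only and not part of the disclosure:

```python
def contrast_ratio(ev):
    """Brightness ratio between the two halves of the test pattern for a
    contrast of `ev` exposure values: each EV doubles the ratio."""
    return 2.0 ** ev

contrast_ratio(1)   # -> 2.0   (1 EV: one half twice as bright)
contrast_ratio(4)   # -> 16.0
```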
In the light of the above problems, it is an object of the present invention to provide a method of detecting a phase difference with improved accuracy, even if left and right images received by a pair of optical sensor arrays differ in brightness or pattern thereof, due to intrusion of stray light.