1. Field of the Invention
This invention relates to an automatic focus adjusting device having a plurality of distance measuring points, for use in a camera or the like.
2. Related Background Art
Many of the automatic focus adjusting systems for single-lens reflex cameras are such that the lens is focused on an object by repetitively effecting the cycles of "focus detection (sensor signal input and focus detection calculation) and lens driving". The amount of lens driving in each cycle is based on the defocus amount at a point of time whereat focus detection has been effected in that cycle, and this presumes that the defocus amount during focus detection will be eliminated at the end of lens driving.
As a matter of course, focus detection and lens driving require a certain amount of time, but in the case of a stationary object, the defocus amount does not vary as long as the lens is not driven. Therefore, the defocus amount to be eliminated at a point of time whereat lens driving has been completed is equal to the defocus amount at a point of time whereat focus detection has been effected and thus, correct focus adjustment is accomplished.
However, in the case of an object which is in motion, the defocus amount may vary during focus detection and lens driving, and therefore the defocus amount to be eliminated and the detected defocus amount may differ remarkably from each other. This results in the problem that the lens is not in focus on the object at the end of lens driving.
Automatic focus adjusting methods directed to a solution to the above-noted problem are proposed in Japanese Laid-Open Patent Applications Nos. 62-125311, 62-139512 and 62-139511 and previously filed Japanese Patent Application No. 62-328233.
The gist of the method disclosed in the afore-mentioned Japanese Patent Application No. 62-328233 is to foresee the variation in defocus attributable to the movement of an object and apply a correction to the amount of lens driving (hereinafter referred to as the pursuit correction), in view of the defocus variation detected in each said cycle and the time interval between the cycles. From the viewpoint of the focus accuracy at the end of lens driving, this method is expected to alleviate the above-noted problem.
However, when the aforementioned pursuit correction is actually made, there arises the following problem.
When an object is being pursued in the pursuit correction mode, if the object in the distance measuring field shifts to another object, the continuity of the change in the imaging plane position is lost. Therefore, when foreseeing is done on the basis of both the data of the old object and the data of the new object, the foreseeing will be wrong, with the result that the lens is driven to an entirely different location.
When the object in the distance measuring field thus shifts to another object, incorrect foreseeing is performed, and this poses the problem that the error is not eliminated as long as foreseeing control is effected by the use of the data of the old object.
The operation of the device shown in the aforementioned Japanese Patent Application will hereinafter be described with reference to the accompanying drawings.
FIG. 2 is a graph for illustrating the lens driving correction method shown in the aforementioned Japanese Patent Application. In FIG. 2, the horizontal axis represents time t, and the vertical axis represents the imaging plane position x of an object.
The curve x(t) indicated by a solid line represents the imaging plane position, at time t, of an object which comes close to the camera in the direction of the optic axis when the photo-taking lens is at infinity. The curve l(t) indicated by a broken line represents the position of the photo-taking lens at time t, and the lens comes into focus when x(t) and l(t) coincide with each other. [ti, ti'] represents the focus detecting operation, and [ti', ti+1] represents the lens driving operation. In the example of the prior art shown in FIG. 2, it is assumed that the imaging plane position changes in accordance with a quadratic function. That is, if the current and past three imaging plane positions (t1, x1), (t2, x2) and (t3, x3) are known at time t3, the imaging plane position x4 at time t4, after TL (AF time-lag + release time-lag) from time t3, can be foreseen on the basis of the equation x(t) = a·t² + b·t + c.
However, what can actually be detected by the camera are not the imaging plane positions x1, x2 and x3, but the defocus amounts DF1, DF2, DF3 and the amounts of lens driving DL1 and DL2 as converted into amounts of movement of the imaging plane. In addition, the time t4 is a value in the future and actually it varies as the accumulation time of an accumulation type sensor is varied by the illuminance of the object, but here it is assumed as follows for simplicity:

t4 - t3 = TL = TM2 + (release time-lag)    (1)
Under the above-mentioned assumption, the amount of lens driving DL3 calculated from the result of the focus detection at the time t3 can be found as follows:

x(t) = a·t² + b·t + c    (2)
If (t1, l1) in FIG. 2 is regarded as the origin (so that t1 = 0), the imaging plane positions are obtained from the detected defocus amounts and the cumulative amounts of lens driving:

x1 = DF1    (3)

x2 = DL1 + DF2    (4)

x3 = DL1 + DL2 + DF3    (5)
By substituting the equations (3), (4) and (5) into the equation (2), a, b and c are found as follows:

a = [(x3 - x1)·t2 - (x2 - x1)·t3] / [t2·t3·(t3 - t2)]

b = [(x2 - x1)·t3² - (x3 - x1)·t2²] / [t2·t3·(t3 - t2)]

c = x1
Consequently, the amount of lens driving DL3 as converted into the amount of movement of the imaging plane at the time t4 is found as follows:

DL3 = a·t4² + b·t4 + c - (DL1 + DL2) = x4 - x3 + DF3
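The calculation described above can be sketched in code. This is an illustrative sketch, not the patented implementation: the function name and parameter layout are assumptions, and times are taken with (t1, l1) as the origin so that t1 = 0, as in the derivation.

```python
def predict_drive(t2, t3, t4, DF1, DF2, DF3, DL1, DL2):
    """Foresee the lens driving amount DL3 (in imaging-plane movement)
    from three focus detections, assuming x(t) = a*t^2 + b*t + c.

    DF1..DF3: detected defocus amounts; DL1, DL2: prior lens drives,
    both converted into imaging plane movement. t1 = 0 at the origin.
    """
    # Imaging plane positions recovered from defocus + cumulative drive
    # (equations (3)-(5) in the text):
    x1 = DF1
    x2 = DL1 + DF2
    x3 = DL1 + DL2 + DF3
    # Quadratic through (0, x1), (t2, x2), (t3, x3):
    denom = t2 * t3 * (t3 - t2)
    a = ((x3 - x1) * t2 - (x2 - x1) * t3) / denom
    b = ((x2 - x1) * t3**2 - (x3 - x1) * t2**2) / denom
    c = x1
    # Foreseen imaging plane position at t4, minus the lens's current
    # position (DL1 + DL2 in imaging-plane terms), gives DL3:
    x4 = a * t4**2 + b * t4 + c
    return x4 - (DL1 + DL2)
```

Note that the returned value equals x4 - x3 + DF3, since the lens's current position DL1 + DL2 is x3 - DF3.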
A problem arising when the object in the distance measuring field shifts to another object will now be described with reference to FIG. 3.
FIG. 3 shows the relation between time and the imaging plane position, and in this figure, the solid line represents the imaging plane position of a first object, and the dot-and-dash line represents the imaging plane position of a second object.
Here, let it be assumed that at times t1 and t2, focus detection is effected for the first object and the lens is driven, and at time t3, focus detection is effected for the second object.
Thereupon, on the camera side, the imaging plane positions x1, x2 and x3' at the times t1, t2 and t3, respectively, are calculated from the defocus amounts and the amounts of lens driving obtained by focus detection, a quadratic function f(t) passing through (t1, x1), (t2, x2) and (t3, x3') is calculated, and the imaging plane position x4'' at time t4 is foreseen on the basis of this f(t).
However, the imaging plane position of the first object at the time t4 is x4, the imaging plane position of the second object at the time t4 is x4', and x4'' obtained by foreseeing is a position differing from the imaging plane positions of both objects.
This is because to foresee the imaging plane position x4 of the first object, it is necessary to find a function passing through (t1, x1), (t2, x2) and (t3, x3), and to foresee the imaging plane position x4' of the second object, it is necessary to find a function passing through (t1, x1'), (t2, x2') and (t3, x3').
On the camera side, however, the first object and the second object cannot be distinguished from each other, and therefore the foreseeing calculation is effected by the use of the defocus amount obtained at the time t3 by focus detection. As a result, the foreseeing function is neither an approximate function of the imaging plane position of the first object nor an approximate function of the imaging plane position of the second object, and the foreseen lens driving position is wrong. This problem arises whenever the photographer changes the main object over to the second object while pursuing the first object, because the wrong foreseeing described above takes place when data from focus detection effected for an object other than the main object exists among the data used for foreseeing.
A countermeasure for such a problem is disclosed in previously filed Japanese Patent Application No. 62-328233 or Japanese Laid-Open Patent Application No. 62-139511 or Japanese Laid-Open Patent Application No. 62-139512. The gist of the techniques disclosed in these applications is to immediately discontinue the pursuit mode when a condition unsuitable for foreseeing occurs, such as the loss of the continuity of the change in the imaging plane position or low illuminance of the object.
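The kind of decision described by those applications can be sketched as follows. This is a hedged illustration, not the disclosed implementation: the threshold value, the function name and the low-illuminance flag are assumptions introduced here to show the idea of dropping the pursuit mode when a condition unsuitable for foreseeing occurs.

```python
# Maximum tolerated gap between the foreseen and the newly measured
# imaging plane position, in imaging-plane units (assumed value):
CONTINUITY_LIMIT = 0.5

def pursuit_still_valid(predicted_x, measured_x, low_illuminance):
    """Return False when a condition unsuitable for foreseeing occurs,
    so that the pursuit (foreseeing) mode is immediately discontinued."""
    if low_illuminance:
        # Sensor data too unreliable for extrapolation.
        return False
    # Continuity of the change in the imaging plane position: the new
    # measurement must lie near where the current prediction expected it.
    return abs(predicted_x - measured_x) <= CONTINUITY_LIMIT
```

When this check fails, the pursuit mode is abandoned and prediction data must be gathered anew, which leads directly to the problem described next.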
However, the prior-art system has suffered from the problem that when, contrary to the photographer's intention, another object has been distance-measured due to camera shake or the interruption by another object, the foreseeing calculation is inhibited by this erroneously distance-measured data, and foreseeing control cannot be resumed until the predetermined data has again been accumulated, whereby the shutter chance is missed.