Automatic focusing systems are widely used in both still and motion picture cameras. Such systems reduce the expertise required of the user. They are particularly important in motion picture cameras, where manually adjusting the focus as the scene evolves becomes impractical if the distance between the camera and the object of interest is changing rapidly.
In one prior art system, the computer that controls the lens searches for the focal position that maximizes the high spatial frequency content of the image. Since an out-of-focus image is blurred, its spatial frequency spectrum has less power in the high frequency portion than the spectrum of the same scene when in focus, provided the scene contains sharp edges or other elements that generate high spatial frequencies. Accordingly, these schemes iteratively search over focal distance for the focus setting that generates the image having the highest ratio of high spatial frequency energy to average spatial frequency energy. The time required to perform this search presents challenges when the algorithm is applied to a rapidly changing scene being captured by a motion picture camera.
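The search described above can be sketched as follows. This is a minimal illustration, not any particular camera's implementation: it assumes a hypothetical `capture(p)` callback that returns a grayscale frame with the lens at position `p`, and it uses the variance of a discrete Laplacian as a stand-in for the ratio of high-frequency to average spectral energy.

```python
import numpy as np

def high_freq_score(image):
    """Score a frame by its high spatial frequency content.

    The variance of a 3x3 discrete Laplacian is used as a proxy for
    the high-frequency energy of the image; a blurred (out-of-focus)
    frame yields a lower score than a sharp one.
    """
    lap = (-4.0 * image
           + np.roll(image, 1, axis=0) + np.roll(image, -1, axis=0)
           + np.roll(image, 1, axis=1) + np.roll(image, -1, axis=1))
    return lap.var()

def search_focus(capture, positions):
    """Return the lens position whose frame maximizes the focus score.

    `capture(p)` is a hypothetical callback returning a 2-D grayscale
    array for lens position `p`; a real system would step the lens and
    read the sensor at each candidate position.
    """
    return max(positions, key=lambda p: high_freq_score(capture(p)))
```

Because every candidate position requires capturing and scoring a full frame, the cost of this loop is what makes the approach slow for rapidly changing scenes.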
A second class of prior art autofocus systems avoids this search time by measuring the phase difference between pixels that view the scene through different portions of the camera lens. These schemes utilize either a dedicated imaging array that is separate from the array that records the photograph, or special pixel sensors embedded in the imaging array itself. In the latter case, the special autofocus pixels replace conventional pixels that would otherwise record the image; hence, the image recorded by the array includes “holes” at the locations corresponding to the autofocus pixels. These holes are filled by interpolating the values of the surrounding pixels.
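The hole-filling step can be sketched as below. This is a simplified illustration under assumed names: `af_mask` marks the hypothetical locations of the autofocus pixels, and each hole is replaced by the mean of its valid four-connected neighbors; production cameras use more elaborate interpolation than this.

```python
import numpy as np

def fill_af_holes(image, af_mask):
    """Fill the "holes" left by phase-detect autofocus pixels.

    `af_mask` is a boolean array that is True where a pixel was used
    for autofocus rather than imaging.  Each such hole is replaced by
    the mean of its in-bounds, non-autofocus 4-neighbors.
    """
    out = image.astype(float).copy()
    for r, c in zip(*np.nonzero(af_mask)):
        neighbors = []
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            rr, cc = r + dr, c + dc
            in_bounds = 0 <= rr < image.shape[0] and 0 <= cc < image.shape[1]
            if in_bounds and not af_mask[rr, cc]:
                neighbors.append(image[rr, cc])
        if neighbors:
            out[r, c] = np.mean(neighbors)
    return out
```

Because the interpolated value is only an estimate of what the missing conventional pixel would have recorded, image quality degrades slightly at each autofocus pixel location.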