1. Field of the Invention
The present invention relates to an image capturing apparatus and a method of controlling the same.
2. Description of the Related Art
As a focus detection method performed by an image capturing apparatus, an on-imaging surface phase difference method is used in which focus detection by a phase difference method is performed using focus detection pixels formed in an image sensor.
U.S. Pat. No. 4,410,804 discloses an image capturing apparatus using a two-dimensional image sensor in which one microlens and a plurality of photoelectric conversion units are formed in each pixel. The plurality of photoelectric conversion units are configured to receive light components that have passed through different regions of the exit pupil of an imaging lens via one microlens, thereby dividing the pupil. A correlation amount is calculated from focus detection signals output from pixels (focus detection pixels) each including a plurality of photoelectric conversion units, and an image shift amount is obtained from the correlation amount, thereby performing focus detection by the phase difference method. Japanese Patent Laid-Open No. 2001-083407 discloses generating an image signal by adding focus detection signals output from a plurality of photoelectric conversion units for each pixel.
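For illustration, the correlation computation described above can be sketched as follows. This is a hedged sketch, not the implementation of the cited patents: a sum-of-absolute-differences correlation is assumed, the signal names `sig_a` and `sig_b` (the pair of pupil-divided focus detection signals) are hypothetical, and a practical implementation would additionally interpolate around the minimum to obtain a sub-pixel image shift amount.

```python
def correlation_amount(sig_a, sig_b, shift):
    """Correlation amount at one shift: sum of absolute differences
    between the pair of focus detection signals over their overlap."""
    return sum(abs(sig_a[i] - sig_b[i + shift])
               for i in range(len(sig_a)) if 0 <= i + shift < len(sig_b))

def image_shift_amount(sig_a, sig_b, max_shift):
    """Image shift amount: the candidate shift that minimizes the
    correlation amount (integer precision only in this sketch)."""
    return min(range(-max_shift, max_shift + 1),
               key=lambda s: correlation_amount(sig_a, sig_b, s))
```

When `sig_b` is a copy of `sig_a` displaced by two samples, `image_shift_amount` returns 2, and the defocus amount is then derived from this image shift amount via a conversion coefficient determined by the optical system.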
Japanese Patent Laid-Open No. 2000-156823 discloses an image capturing apparatus in which pairs of focus detection pixels are partially arranged in a two-dimensional image sensor formed from a plurality of imaging pixels. The pairs of focus detection pixels are configured to receive light components from different regions of the exit pupil of an imaging lens via a light shielding layer having openings, thereby dividing the pupil. An image signal is acquired by the imaging pixels, which occupy most of the two-dimensional image sensor. A correlation amount is calculated from the focus detection signals of the partially arranged focus detection pixels, and an image shift amount is obtained from the correlation amount, thereby performing focus detection by the phase difference method.
In focus detection using the on-imaging surface phase difference method, the defocus direction and the defocus amount can be detected simultaneously from the focus detection signals obtained from the image sensor. It is therefore possible to perform focus control at high speed.
To improve the low luminance limit of focus detection, a method has been proposed that adds a plurality of correlation amounts calculated from the focus detection signals of a plurality of focus detection lines in a focus detection area, and obtains an image shift amount from the resulting added correlation amount (the correlation amount after the addition), which contains little noise. In this method, an evaluation value used to judge the reliability of the image shift amount, and the like, are also calculated from the added correlation amount.
The plurality of focus detection lines in the focus detection area may include a line unsuitable for correlation amount detection, such as a saturated line in which the ratio of saturated signals is high, or a defective line including a pixel having some defect. When focus detection is performed by calculating the added correlation amount of the focus detection lines included in a focus detection area, the correlation amount of such a line needs to be excluded from the addition processing. For this reason, the number of correlation amounts added to obtain the added correlation amount is not constant but varies. Along with this variation, the added correlation amount, the evaluation value calculated from it to judge reliability, and the like also vary. Hence, the added correlation amount needs to be normalized.
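The addition with line exclusion described above can be sketched as follows. The saturation test, the threshold ratio of 0.1, and the 8-bit saturation level are all assumptions for illustration, not values taken from the source; the essential point is that the number of lines actually added must be carried along for the subsequent normalization.

```python
SAT_LEVEL = 255  # assumed saturation level for 8-bit signals

def sad(a, b, shift):
    """Per-line correlation amount (sum of absolute differences)."""
    return sum(abs(a[i] - b[i + shift])
               for i in range(len(a)) if 0 <= i + shift < len(b))

def is_valid(a, b, max_sat_ratio=0.1):
    """Judge a line unsuitable when the ratio of saturated samples is
    high (a defective-pixel check would be handled similarly)."""
    sat = sum(v >= SAT_LEVEL for v in a + b)
    return sat / (len(a) + len(b)) <= max_sat_ratio

def added_correlation(line_pairs, shifts, valid=is_valid):
    """Add per-line correlation amounts over the focus detection area,
    skipping unsuitable lines. Returns the added correlation amount per
    shift and the number of lines actually added, since that count
    varies and is needed to normalize the added correlation amount."""
    added = [0] * len(shifts)
    n_added = 0
    for a, b in line_pairs:
        if not valid(a, b):
            continue
        n_added += 1
        for k, s in enumerate(shifts):
            added[k] += sad(a, b, s)
    return added, n_added
```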
However, when the processing is performed by a computing circuit such as an FPGA, normalizing the added correlation amount by division for each shift amount requires a large computing circuit scale and a heavy computing load as compared to addition, multiplication, bit shift operation, and the like.
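One common way to avoid a per-element divider in such hardware is to replace the division by a single reciprocal computation (in an FPGA, typically a small lookup table indexed by the line count) followed by a multiply and a bit shift per shift amount. The following is a sketch of that general fixed-point technique, not the method of the present invention; the precision parameter `SHIFT` is an assumed value.

```python
SHIFT = 16  # assumed fixed-point precision of the reciprocal

def normalize(added, n_lines):
    """Normalize added correlation amounts by the number of added lines
    using one reciprocal (single division, or an FPGA lookup table
    entry), then only a multiply and a bit shift per element."""
    recip = (1 << SHIFT) // n_lines   # computed once per focus detection area
    half = 1 << (SHIFT - 1)           # rounding offset
    return [(c * recip + half) >> SHIFT for c in added]
```

For example, `normalize([30, 60, 90], 3)` yields `[10, 20, 30]`, matching exact division while keeping the per-shift-amount work to one multiplication and one bit shift.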