In image pickup apparatuses, such as digital cameras and digital video cameras, light from a subject enters a sensor having a plurality of elements, such as a CCD or CMOS, via an image pickup optical system including a lens or the like. The light that has passed through the image pickup optical system is converted into an electric signal in the sensor. By performing processes necessary for forming an image, such as an A/D conversion process and a demosaic process, on the electric signal, a captured image can be obtained.
The quality of such a captured image is affected by the image pickup optical system. Generally, when a high-performance lens is used, a sharp image with a low degree of blurring can be obtained. In contrast, a captured image that is obtained using a low-performance lens is blurred. For example, in the case of capturing an image of a starry sky, individual stars are seen as sharp dots if the image is captured using a lens that causes a low degree of blurring. In contrast, if the image is captured using a lens that causes a high degree of blurring, individual stars are blurred and expanded and are not seen as dots.
Hereinafter, a description will be given of an image processing method for correcting a blur in a captured image that is caused by an image pickup optical system. In this method, a blur in a captured image is corrected on the basis of a point spread function (PSF). The PSF represents how a point of a subject blurs. For example, a two-dimensional distribution of light on a sensor surface in a case where an illuminant (point source) with a very small volume is captured in darkness corresponds to the PSF of the image pickup optical system.
In an ideal image pickup optical system with a low degree of blurring, the PSF is a point. In an image pickup optical system with a high degree of blurring, the PSF is not a small point but is expanded to some extent.
One method for correcting a blur using data relating to the PSF is a method using an inverse filter. Hereinafter, a method for forming an inverse filter will be described. A captured image obtained by using an ideal image pickup optical system that prevents the occurrence of blurring is represented by f(x, y). x and y are variables representing a two-dimensional position in the captured image, and f(x, y) represents the pixel value at the position (x, y). On the other hand, a captured image obtained by using an image pickup optical system that causes blurring is represented by g(x, y). Also, the PSF of the foregoing image pickup optical system that causes blurring is represented by h(x, y). h(x, y) is determined by, for example, the characteristics of the lens, the capture parameters (aperture, position of an object, zoom position, etc.), and the transmittance of the color filters of the sensor. h(x, y) may also be determined by measuring the two-dimensional distribution of light on the sensor surface in a case where an image of a point source is captured. The following relationship is established among f(x, y), g(x, y), and h(x, y):

g(x, y) = h(x, y) * f(x, y)  (1)
* represents convolution (convolution integral). Correcting a blur corresponds to estimating f(x, y), which is obtained by using an ideal image pickup optical system, from a captured image g(x, y) obtained by using an image pickup optical system that causes blurring and h(x, y), which is the PSF of the image pickup optical system.
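The relationship in Equation 1 can be sketched numerically. The following NumPy example blurs an ideal point-source image f(x, y), such as a single star on a dark background, with a hypothetical Gaussian PSF h(x, y); the PSF shape, size, and image dimensions are illustrative assumptions, and the convolution is computed circularly via the FFT for brevity.

```python
import numpy as np

# Hypothetical Gaussian PSF h(x, y), normalized so its values sum to 1.
size, sigma = 9, 2.0
ax = np.arange(size) - size // 2
xx, yy = np.meshgrid(ax, ax)
h = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
h /= h.sum()

# Ideal image f(x, y): a single star (point source) on a dark background.
f = np.zeros((64, 64))
f[32, 32] = 1.0

# Equation 1: g = h * f, computed as a circular convolution via the FFT.
H = np.fft.fft2(h, s=f.shape)
g = np.real(np.fft.ifft2(np.fft.fft2(f) * H))
```

The total light energy is preserved by the convolution, but the point is spread out: the peak value of g is far smaller than that of f, which corresponds to the blurred, expanded stars described above.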
If a Fourier transform is performed on Equation 1, which is expressed in the real plane, to transform it into a representation in the spatial frequency plane, the form of a product at each frequency is obtained, as in the following equation:

G(u, v) = H(u, v) · F(u, v)  (2)
H(u, v) is obtained by performing a Fourier transform on h(x, y), which is the PSF, and is called the optical transfer function (OTF). u and v represent the coordinates in the two-dimensional frequency plane, that is, a frequency. Similarly, G(u, v) is obtained by performing a Fourier transform on g(x, y), and F(u, v) is obtained by performing a Fourier transform on f(x, y).
In order to obtain an ideal unblurred image from a blurred captured image, both sides may be divided by H as follows:

G(u, v)/H(u, v) = F(u, v)  (3)
By performing an inverse Fourier transform on F(u, v) to return it to a representation in the real plane, the ideal unblurred image f(x, y) can be obtained as a recovery image.
Here, assume that an inverse Fourier transform is performed on the reciprocal of H in Equation 3, that is, on 1/H(u, v), to obtain R. Then, convolution with the image in the real plane is performed as in the following equation, so that an unblurred image can be obtained similarly:

g(x, y) * R(x, y) = f(x, y)  (4)
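The chain from Equation 2 through Equation 4 can be sketched as follows. The example assumes a hypothetical separable 3×3 PSF whose OTF is never zero, so that the division in Equation 3 is defined at every frequency; the PSF values and image size are illustrative, not taken from the text.

```python
import numpy as np

# Hypothetical separable 3x3 PSF whose OTF H(u, v) never reaches zero,
# so the division in Equation 3 is well defined at every frequency.
h1 = np.array([0.1, 0.8, 0.1])
h = np.outer(h1, h1)

# Ideal image f(x, y): a point source.
f = np.zeros((32, 32))
f[16, 16] = 1.0

# Equation 2: G = H . F (the blur applied as a product in the frequency domain).
H = np.fft.fft2(h, s=f.shape)
g = np.real(np.fft.ifft2(np.fft.fft2(f) * H))

# Inverse filter R(x, y): the inverse Fourier transform of 1/H(u, v).
R = np.real(np.fft.ifft2(1.0 / H))

# Equation 4: g * R = f, again as a circular convolution via the FFT.
f_rec = np.real(np.fft.ifft2(np.fft.fft2(g) * np.fft.fft2(R)))
```

Because this particular OTF has no zeros, the recovered image f_rec matches the ideal image f to within numerical precision.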
R(x, y) is called an inverse filter. In practice, however, a frequency (u, v) at which H(u, v) is 0 may exist. At such a frequency, division by zero occurs in Equation 3, and the calculation cannot be performed.
Normally, the value of the OTF decreases as the frequency increases, and thus its reciprocal, that is, the value of the inverse filter R(x, y), increases as the frequency increases. Thus, if a convolution process is performed on a blurred captured image using the inverse filter, the high-frequency components of the captured image are emphasized. An actual captured image includes noise, and the noise typically has high-frequency components, and thus the inverse filter may emphasize the noise.
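The noise-amplification behavior described above can be illustrated numerically. Extending the earlier sketch with an assumed small amount of sensor noise (the PSF, noise level, and image size are all illustrative assumptions):

```python
import numpy as np

# Hypothetical 3x3 PSF whose OTF H is nonzero, so the inverse filter exists.
h1 = np.array([0.1, 0.8, 0.1])
h = np.outer(h1, h1)

f = np.zeros((32, 32))
f[16, 16] = 1.0
H = np.fft.fft2(h, s=f.shape)
g = np.real(np.fft.ifft2(np.fft.fft2(f) * H))

# Add a small amount of simulated sensor noise to the blurred image.
rng = np.random.default_rng(0)
noise = 0.01 * rng.standard_normal(g.shape)
g_noisy = g + noise

# Apply the inverse filter: the noise passes through 1/H, whose magnitude is
# largest at high frequencies, so the noise is amplified in the result.
f_rec = np.real(np.fft.ifft2(np.fft.fft2(g_noisy) / H))

err_in = np.std(noise)        # noise level before recovery
err_out = np.std(f_rec - f)   # error remaining after inverse filtering
```

Since |H(u, v)| is at most 1 and falls below 1 everywhere except at zero frequency, every noise component is amplified or left unchanged, and the overall error after recovery exceeds the original noise level.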
In order to overcome the problem that the calculation cannot be performed because of the above-described division by zero, and in order not to excessively emphasize high-frequency noise, a Wiener filter, which is obtained by transforming the equation of the inverse filter R(x, y), has been suggested. Hereinafter, filters that are used for correcting a blur, such as the inverse filter and the Wiener filter, will be referred to as image recovery filters.
In many image pickup apparatuses, such as digital cameras and digital video cameras, color filters of a plurality of specific colors are arranged in front of a sensor having a plurality of elements, such as a CCD or a CMOS, thereby obtaining color information. This method is referred to as a single-chip method. A typical example of a color filter array used for a single-chip digital camera or a single-chip digital video camera is the Bayer array. In the case of a single-chip image pickup apparatus, a signal of another color cannot be obtained from an element corresponding to a color filter of a specific color. Thus, a signal of another color is obtained through interpolation using signals from neighboring elements. This interpolation process is referred to as a demosaic process (demosaicing process). Hereinafter, an image on which a demosaic process has not been performed is referred to as RAW data.
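The single-chip idea can be sketched as follows: a full-color scene is sampled through an assumed RGGB Bayer pattern to form RAW data, and each missing color sample is then filled in by averaging the available neighbors, a crude bilinear stand-in for a real demosaic process. The pattern layout, kernel, and image contents are illustrative assumptions.

```python
import numpy as np

def conv2(a, k3):
    """Circular 'same' convolution of image a with a 3x3 kernel via the FFT."""
    kfull = np.zeros_like(a)
    kfull[:3, :3] = k3
    kfull = np.roll(kfull, (-1, -1), axis=(0, 1))  # put kernel center at (0, 0)
    return np.real(np.fft.ifft2(np.fft.fft2(a) * np.fft.fft2(kfull)))

# Assumed RGGB Bayer pattern: R at (even, even), B at (odd, odd), G elsewhere.
def bayer_masks(shape):
    r = np.zeros(shape); r[0::2, 0::2] = 1
    b = np.zeros(shape); b[1::2, 1::2] = 1
    g = 1.0 - r - b
    return r, g, b

def demosaic(raw):
    """Bilinear demosaic: fill each color by averaging the available samples."""
    k = np.array([[0.25, 0.5, 0.25],
                  [0.5,  1.0, 0.5 ],
                  [0.25, 0.5, 0.25]])
    out = np.empty(raw.shape + (3,))
    for i, mask in enumerate(bayer_masks(raw.shape)):
        out[..., i] = conv2(raw * mask, k) / conv2(mask, k)
    return out

# RAW data for a uniform-color subject: each element records only one color.
rgb = np.empty((8, 8, 3))
rgb[..., 0], rgb[..., 1], rgb[..., 2] = 0.2, 0.5, 0.7
r, g, b = bayer_masks((8, 8))
raw = r * rgb[..., 0] + g * rgb[..., 1] + b * rgb[..., 2]
img = demosaic(raw)
```

For a uniform subject this interpolation reproduces the original color at every pixel; real demosaic algorithms are considerably more elaborate near edges and fine detail.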