Conventionally, among monocular three-dimensional imaging devices of this kind, a device having the optical system illustrated in FIG. 9 is known (Patent Literature 1).
In this optical system, subject images passing through laterally different regions of a main lens 1 and a relay lens 2 are pupil-divided by a mirror 4 and are formed on image sensors 7 and 8 via imaging lenses 5 and 6.
Portions (A) to (C) of FIG. 10 are diagrams illustrating how the images formed on the image sensors separate at front focus, at focus (best focus) and at rear focus, respectively. In FIG. 10, the mirror 4 illustrated in FIG. 9 is omitted so that the differences in separation among the focus states can be compared.
As illustrated in portion (B) of FIG. 10, the pupil-divided images at focus are formed at the same positions on the image sensors (coincident), whereas, as illustrated in portions (A) and (C) of FIG. 10, the images at front focus and at rear focus are formed at different positions on the image sensors (separated).
Accordingly, by acquiring the subject images pupil-divided in the right and left directions via the image sensors 7 and 8, a left viewpoint image and a right viewpoint image whose disparity varies according to the subject distance (a 3D image) can be acquired.
An imaging device has also been proposed that achieves a deep depth of field and the like by regularly dispersing the light flux with a phase plate and restoring the image by digital processing (Non-Patent Literature 1).
In a typical imaging optical system, light rays concentrate most densely at the best focus position, and the blur diameter widens approximately in proportion to the defocus amount as the distance from the best focus position increases. The shape of the blur is represented by a point spread function (PSF) (Patent Literature 2).
It is known that, when an object body as a subject is imaged on the imaging plane of an image sensor via an optical system such as a zoom lens, the captured image has degraded image quality compared with the original object body because aberration of the optical system causes blur. The image intensity distribution g of the image at this stage is given by

g = f*h + n  . . .  (A)

where * denotes convolution integral, f is the luminance distribution of the original object body, h is the point image intensity distribution representing the imaging performance of the optical system, and n is noise. When g, h and n are known, the luminance distribution f of the original object body can be calculated from equation (A). The technology of obtaining an ideal image by removing the blur of the optical system through signal processing is called "restoration", "inverse convolution" or "deconvolution" of the image. A restoration filter based on the point image intensity distribution (PSF) is generated in consideration of information regarding degradation of the image at the time of image capturing, such as image capturing conditions (for example, exposure time, amount of exposure, distance to the subject and focal length) and characteristic information of the imaging device (for example, optical characteristics of the lens and identification information of the imaging device) (Patent Literature 3).
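As an illustration only (not code from the cited literature), the degradation model of equation (A) and a simple Wiener-type restoration filter can be sketched in Python, assuming the PSF h is known; the signal is one-dimensional and all parameter values are placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)

# Original object luminance distribution f (1D for simplicity).
f = np.zeros(64)
f[20] = 1.0
f[40] = 0.5

# Point image intensity distribution h (Gaussian PSF), normalized.
x = np.arange(64)
h = np.exp(-0.5 * ((x - 32) / 2.0) ** 2)
h /= h.sum()

# Degraded image g = f*h + n (circular convolution via FFT;
# ifftshift places the PSF center at the array origin first).
H = np.fft.fft(np.fft.ifftshift(h))
g = np.real(np.fft.ifft(np.fft.fft(f) * H))
g += 0.001 * rng.standard_normal(64)  # noise n

# Wiener-type restoration filter: conj(H) / (|H|^2 + k),
# where k regularizes against amplifying noise at frequencies
# where |H| is small.
k = 1e-3
W = np.conj(H) / (np.abs(H) ** 2 + k)
f_hat = np.real(np.fft.ifft(np.fft.fft(g) * W))

# The restored image f_hat recovers the peaks of f far more
# sharply than the blurred observation g does.
print(int(np.argmax(f_hat)))
```

A pure inverse filter (k = 0) would divide by near-zero values of H and amplify the noise n without bound, which is why practical restoration filters include a regularization term.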
A degradation model due to the blur can be expressed by a function. For example, a normal distribution parameterized by the distance from the center pixel (image height) can express the blur phenomenon (Patent Literature 4).
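A minimal sketch of such a model (illustrative only; the linear growth of the blur spread with image height and all parameter values are assumptions, not taken from the cited literature):

```python
import numpy as np

def psf_from_image_height(r, size=9, sigma0=0.5, slope=0.002):
    """Hypothetical normal-distribution (Gaussian) PSF whose spread
    grows with image height r (distance in pixels from the image
    center); sigma0 and slope are placeholder parameters."""
    sigma = sigma0 + slope * r  # assumed linear growth with image height
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    psf = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    return psf / psf.sum()  # normalize to unit energy

# Blur is mild at the image center and stronger toward the periphery:
# a wider PSF spreads the same energy over more pixels, so its peak drops.
center = psf_from_image_height(0)
edge = psf_from_image_height(1000)
print(center.max() > edge.max())
```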
In addition, the optical transfer function (OTF) is the two-dimensional Fourier transform of the PSF into the frequency domain. Since conversion between the PSF and the OTF is easy in either direction, the OTF can be regarded as equivalent to the PSF (Patent Literature 5).
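For example (an illustrative sketch, not code from the cited literature), the conversion between a PSF and its OTF is a single forward/inverse FFT pair:

```python
import numpy as np

# A small 2D Gaussian PSF, normalized to unit energy.
y, x = np.mgrid[-8:9, -8:9]
psf = np.exp(-(x**2 + y**2) / (2.0 * 1.5**2))
psf /= psf.sum()

# OTF: two-dimensional Fourier transform of the PSF
# (ifftshift puts the PSF center at the array origin first,
# so the OTF has zero phase for a symmetric PSF).
otf = np.fft.fft2(np.fft.ifftshift(psf))

# The zero-frequency response of a unit-energy PSF is 1, and the
# inverse transform recovers the PSF exactly, which is why the two
# representations carry the same information.
psf_back = np.real(np.fft.ifft2(otf))
print(np.allclose(np.fft.fftshift(psf_back), psf))
```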
A special optical system in which the defocus amount is unknown and the PSF spreads similarly whether at focus or out of focus is called EDof (Extended Depth of Field) (Patent Literatures 6 to 8). FIG. 11 illustrates one example of a deconvolution filter corresponding to the PSF of EDof. This deconvolution filter is selected according to the image height (distance from the image center) at each position in the image. In FIG. 11, an image height of 0 to r1 corresponds to a filter a, an image height of r1 to r2 to a filter b, and an image height of r2 or more to a filter c.
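The three-zone filter selection of FIG. 11 might be sketched as follows (illustrative only; the thresholds r1 and r2, the image center, and the filters a, b and c are placeholders, while a real device would hold measured deconvolution kernels):

```python
import numpy as np

def select_filter(px, py, center, r1, r2, filter_a, filter_b, filter_c):
    """Choose a deconvolution filter by image height (distance of
    pixel (px, py) from the image center), as in the three-zone
    example of FIG. 11."""
    r = np.hypot(px - center[0], py - center[1])
    if r < r1:
        return filter_a   # image height 0 to r1
    elif r < r2:
        return filter_b   # image height r1 to r2
    return filter_c       # image height r2 or more

# Placeholder filters standing in for measured kernels.
a, b, c = "filter a", "filter b", "filter c"
print(select_filter(960, 540, (960, 540), 300, 600, a, b, c))
print(select_filter(1900, 1060, (960, 540), 300, 600, a, b, c))
```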
Patent Literature 9 describes one example of a viewpoint image input device.
Patent Literature 10 describes one example of a technology for detecting the image displacement amount (defocus amount) by dividing the imaging screen and performing focus detection of the subject for each divided region. In addition, since the image displacement amount depends on the distance to the subject, the image displacement amount can technically be regarded as equivalent to the subject distance.
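The correspondence between a defocus amount and a subject distance can be sketched with the thin-lens equation (an illustrative sketch under an idealized thin-lens assumption, not the method of the cited literature; the focal length and distances are placeholder values):

```python
def subject_distance_from_defocus(focal_len_mm, focused_dist_mm, defocus_mm):
    """Convert a sensor-side defocus amount into a subject distance
    using the thin-lens equation 1/f = 1/s + 1/s': shift the image
    distance s' by the defocus amount, then invert back to the
    object side."""
    # Image distance for the currently focused subject plane.
    s_img = 1.0 / (1.0 / focal_len_mm - 1.0 / focused_dist_mm)
    # Image distance corresponding to the displaced (defocused) subject.
    s_img_defocused = s_img + defocus_mm
    # Invert the thin-lens equation back to the object side.
    return 1.0 / (1.0 / focal_len_mm - 1.0 / s_img_defocused)

# With zero defocus, the recovered distance equals the focused distance;
# a nonzero defocus maps monotonically to a different subject distance,
# which is why the displacement amount can stand in for the distance.
print(round(subject_distance_from_defocus(50.0, 2000.0, 0.0), 3))
```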