A solid-state image sensing device adapted to detect the distance to an object by a phase difference method has been proposed for digital cameras and digital video cameras, in which pixels having a distance-measuring function (hereinafter referred to as distance-measuring pixels) are used as all or part of the pixels in the image sensing device (see PTL 1). Each distance-measuring pixel includes plural photoelectric conversion sections, such that luminous fluxes passing through different areas on a pupil of a camera lens are led to the respective photoelectric conversion sections. Optical images are generated by the luminous fluxes passing through the different pupil areas; these optical images are referred to respectively as an “A-image” and a “B-image,” and collectively as “AB-images.” The optical images are acquired based on signals obtained by the photoelectric conversion sections contained in each distance-measuring pixel. An amount of gap, i.e., a relative position change between the AB-images (hereinafter also referred to as an “image gap amount”), is detected, and the distance to the object can be calculated by converting the image gap amount into an amount of defocus. Defocus is a state in which the imaging plane of a taking lens does not match the image sensing plane and is shifted in a direction of the optical axis; the amount of this shift is the amount of defocus. Unlike a conventional contrast method, this eliminates the need to move the lens during distance measurement, and thereby enables high-speed, high-accuracy distance measurement.
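The detection of the image gap amount described above can be illustrated with a minimal sketch: the A-image and B-image are treated as one-dimensional signals, the relative shift is found by minimizing a sum-of-absolute-differences correlation over candidate shifts, and the shift is converted into a defocus amount. The function names, the SAD criterion, and the `conversion_factor` constant are illustrative assumptions; in an actual device the conversion coefficient depends on the pupil geometry and is calibrated per lens.

```python
import numpy as np

def image_gap(a, b, max_shift=10):
    """Estimate the image gap amount (relative shift) between the
    A-image signal `a` and B-image signal `b` by minimizing the mean
    absolute difference over candidate integer shifts.

    This is an illustrative sketch; real devices use sub-pixel
    correlation methods."""
    n = len(a)
    best_shift, best_sad = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        # Overlapping region where a[i] is compared with b[i + s].
        lo, hi = max(0, -s), min(n, n - s)
        sad = np.abs(a[lo:hi] - b[lo + s:hi + s]).mean()
        if sad < best_sad:
            best_sad, best_shift = sad, s
    return best_shift

def defocus_from_gap(gap, conversion_factor=2.0):
    # conversion_factor is a hypothetical constant; in practice it is
    # determined by the baseline between the two pupil areas.
    return gap * conversion_factor
```

For example, a B-image that is a copy of the A-image displaced by three pixels yields an image gap amount of 3, which is then scaled into a defocus amount.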
A depth-from-defocus (DFD) method has been proposed as another method capable of acquiring distance-to-object information (see PTL 2). In the DFD method, two images are acquired under different shooting conditions (focal length, aperture value, and the like) of a taking lens mounted on a digital camera or the like, and a blurring correlation value between the two images is calculated for each pixel. The distance to the object can then be calculated for each pixel by referring to a reference look-up table, which defines the relationship between the blurring correlation value and the object distance.
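The DFD procedure above can be sketched in two steps: compute a blurring correlation value between corresponding patches of the two images, then map it to a distance through a reference table. The normalized-variance measure and the table values below are purely illustrative assumptions; an actual table is calibrated for the specific lens and shooting conditions.

```python
import numpy as np

# Hypothetical reference look-up table: blurring correlation value
# -> object distance in meters. Real tables are calibrated per lens
# and per pair of shooting conditions.
LUT_CORR = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
LUT_DIST = np.array([0.5, 1.0, 2.0, 4.0, 8.0])

def blur_correlation(patch1, patch2, eps=1e-9):
    """One simple illustrative blur correlation measure: the
    normalized difference of local variances between the two
    differently blurred patches (more blur -> lower variance)."""
    v1, v2 = patch1.var(), patch2.var()
    return (v1 - v2) / (v1 + v2 + eps)

def distance_from_correlation(c):
    # Linear interpolation in the reference table; LUT_CORR must be
    # monotonically increasing for np.interp to be valid.
    return float(np.interp(c, LUT_CORR, LUT_DIST))
```

Applying `blur_correlation` to each pixel neighborhood and feeding the result through `distance_from_correlation` yields a per-pixel distance estimate, as described for the DFD method.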