The present technology relates to an image processing apparatus and method, and a program, and particularly to an image processing apparatus and method, and a program, capable of reducing zipper noise.
Imaging apparatuses using an image sensor are mainly classified into a single-plate type apparatus (hereinafter, referred to as a single-plate type camera) using a single image sensor and a three-plate type apparatus (hereinafter, referred to as a three-plate type camera) using three image sensors.
In the three-plate type camera, for example, three image sensors are provided for an R signal, a G signal, and a B signal, respectively, and three primary color signals are obtained using the three image sensors. In addition, a color image signal generated from the three primary color signals is recorded onto a recording medium.
In the single-plate type camera, a single image sensor is used in which a color coding filter formed by an array of color filters, one assigned to each pixel, is provided on a front surface, and a color component signal which is color-coded by the color coding filter is obtained for each pixel. As a color filter array forming the color coding filter, primary color filter arrays of red (R), green (G), and blue (B), or complementary color filter arrays of yellow (Ye), cyan (Cy), and magenta (Mg) are used. In addition, in the single-plate type camera, a single color component signal is obtained for each pixel using the image sensor, and color signals other than the color component signal of each pixel are generated through a linear interpolation process, thereby obtaining an image close to an image obtained by the three-plate type camera. In a video camera and the like, the single-plate type is employed in order to achieve miniaturization and weight reduction.
A color filter array having the Bayer array is frequently used as the color filter array forming the color coding filter. In the Bayer array, G filters are disposed in a checkered pattern, and R and B filters are alternately disposed for each column in the remaining parts.
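The sampling pattern described above can be sketched as boolean masks. This is an illustrative sketch, not part of the described apparatus; the RGGB phase chosen below is one common convention, and all names are ours.

```python
import numpy as np

def bayer_masks(h, w):
    """Build boolean sampling masks for an RGGB Bayer color filter array.

    G filters lie on a checkered pattern; R and B fill the remaining
    sites on alternating rows (illustrative phase convention).
    """
    r = np.zeros((h, w), dtype=bool)
    g = np.zeros((h, w), dtype=bool)
    b = np.zeros((h, w), dtype=bool)
    r[0::2, 0::2] = True   # R on even rows, even columns
    b[1::2, 1::2] = True   # B on odd rows, odd columns
    g[0::2, 1::2] = True   # G fills the checkered remainder
    g[1::2, 0::2] = True
    return r, g, b

r, g, b = bayer_masks(4, 4)
```

Note that in any such array half of the pixels sample G and a quarter each sample R and B, which is why the G plane is usually interpolated first.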
In this case, the image sensor outputs only an image signal corresponding to the color of the filter, from each pixel where one color filter of the three primary colors of R, G and B is disposed. In other words, an R component image signal is output from a pixel in which the R filter is disposed, but G component and B component image signals are not output therefrom. Similarly, only a G component image signal is output from a G pixel, and R component and B component image signals are not output therefrom. Only a B component image signal is output from a B pixel, and R component and G component image signals are not output therefrom.
However, when the signal of each pixel is processed in a subsequent stage of an image process, R component, G component, and B component image signals are necessary for every pixel. Therefore, in the related art, n×m (where n and m are positive integers) image signals of the R pixels, n×m image signals of the G pixels, and n×m image signals of the B pixels are obtained from an output of the image sensor formed by n×m pixels through interpolation operations, respectively, and are output to the subsequent stage.
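One simple way to obtain a full n×m plane for each color from the sparse mosaic samples is bilinear interpolation, which can be written as a normalized convolution: each missing sample is the weighted mean of the sampled neighbors of that color. The following is a minimal sketch under that assumption; the function names are illustrative, not from any particular product.

```python
import numpy as np

def interpolate_plane(mosaic, mask, kernel):
    """Estimate a full h x w color plane from its sparse mosaic samples.

    mosaic: sensor output (only values where mask is True are valid)
    mask:   boolean array marking where this color was actually sampled
    kernel: averaging window, e.g. a 3x3 block of ones
    """
    h, w = mosaic.shape
    pad = kernel.shape[0] // 2
    s = np.pad(np.where(mask, mosaic, 0.0), pad)   # zero out unsampled sites
    m = np.pad(mask.astype(float), pad)
    num = np.zeros((h, w))
    den = np.zeros((h, w))
    for dy in range(kernel.shape[0]):
        for dx in range(kernel.shape[1]):
            num += kernel[dy, dx] * s[dy:dy + h, dx:dx + w]
            den += kernel[dy, dx] * m[dy:dy + h, dx:dx + w]
    return num / np.maximum(den, 1e-12)            # weighted mean of neighbors

# G sites form a checkered pattern in the Bayer array.
h, w = 6, 6
g_mask = (np.add.outer(np.arange(h), np.arange(w)) % 2 == 1)
mosaic = np.full((h, w), 0.5)    # flat scene: every sample is 0.5
kernel = np.ones((3, 3))         # plain 3x3 averaging window
g_full = interpolate_plane(mosaic, g_mask, kernel)
```

On a flat scene the interpolated plane reproduces the constant value everywhere; on real images this plain averaging blurs edges, which is what motivates the direction-adaptive methods discussed next.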
A DLMMSE method is known in the related art (refer to L. Zhang and X. Wu, "Color demosaicking via directional linear minimum mean square-error estimation," IEEE Trans. on Image Processing, vol. 14, no. 12, pp. 2167-2178, 2005).
In the DLMMSE method, first, with respect to an input image from an image sensor, a G component pixel signal is interpolated, and then B component and R component pixel signals are interpolated using the color differences (B-G and R-G) obtained after the G component is interpolated. In addition, when the G component is interpolated, an interpolation value which produces a minimum square error is generated in each of the vertical direction and the horizontal direction of the input image. Further, directionality in the vertical direction and the horizontal direction is detected, and the interpolation value in the vertical direction and the interpolation value in the horizontal direction are apportioned on the basis of a detection result.
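The apportioning step above can be illustrated with a much simplified sketch: at a non-G pixel, form a vertical and a horizontal estimate of G, then blend them with weights derived from the variation in each direction so that the smoother direction dominates. This is not the full DLMMSE algorithm (it omits the minimum mean square-error estimation of the color differences); the weighting scheme and names are our own illustration.

```python
def blend_directional_g(up, down, left, right, eps=1e-6):
    """Blend vertical and horizontal G estimates at a non-G pixel.

    up/down and left/right are the G values of the vertical and
    horizontal neighbors; eps avoids division by zero on flat areas.
    """
    gv = 0.5 * (up + down)           # vertical interpolation value
    gh = 0.5 * (left + right)        # horizontal interpolation value
    dv = abs(up - down) + eps        # variation in the vertical direction
    dh = abs(left - right) + eps     # variation in the horizontal direction
    wv = dh / (dv + dh)              # smoother direction gets the larger weight
    wh = dv / (dv + dh)
    return wv * gv + wh * gh

# Near a horizontal edge the vertical neighbors differ strongly,
# so the horizontal estimate dominates the blended value.
g_edge = blend_directional_g(up=10.0, down=200.0, left=100.0, right=100.0)
g_flat = blend_directional_g(up=5.0, down=5.0, left=5.0, right=5.0)
```

Blending the two directional estimates rather than hard-selecting one avoids abrupt switches along slanted edges, which is one source of the zipper noise mentioned at the outset.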