1. Technical Field
The present invention relates to an image sensor, and more particularly to an apparatus and a method that can compensate for color deviation occurring at the corners of an image photographed by the image sensor.
2. Description of the Related Art
An image sensor refers to a semiconductor device that converts an optical image into an electric signal. Image sensors include the charge coupled device (CCD) and the complementary metal-oxide-silicon (CMOS) image sensor. The charge coupled device (CCD) has metal-oxide-silicon (MOS) capacitors disposed adjacent to each other, which store and transfer electric charge carriers. The complementary metal-oxide-silicon (CMOS) image sensor employs CMOS technology, which uses a control circuit and a signal processing circuit as peripheral circuits, to provide as many MOS transistors as there are pixels and to detect the outputs from the MOS transistors one by one.
Recently, portable devices having such an image sensor (e.g., digital cameras, mobile communication terminals, etc.) have been introduced into the market. The image sensor is composed of an array of photosensitive diodes called pixels or photosites. A pixel itself usually does not extract color from light, but converts photons in a broad spectral band into electrons.
In order to record color images with a single sensor, the sensor is filtered such that different pixels receive light of different colors. This type of sensor is known as a color filter array (CFA). Different color filters are arranged across the sensor according to a predefined pattern.
As the most common pattern, the Bayer pattern is widely employed in the CFA. In the Bayer pattern, half of the total number of pixels are green (G), while a quarter each of the total number are assigned to red (R) and blue (B). In order to obtain color information, the red, green and blue filters are arranged in a particular sequence to form a repetitive pattern. The Bayer pattern is composed of a repeating 2×2 array.
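The tiling described above can be sketched in a few lines of code. This is an illustrative sketch only, not part of the invention; the particular 2×2 unit cell used here (G/R on the first row, B/G on the second) is one common arrangement and is assumed for illustration.

```python
# Illustrative sketch: building a Bayer CFA layout by tiling the 2x2 unit cell.
# The unit cell assumed here is the common arrangement:
#   G R
#   B G
BAYER_CELL = [["G", "R"],
              ["B", "G"]]

def bayer_pattern(height, width):
    """Return an height x width grid of color letters by repeating the 2x2 cell."""
    return [[BAYER_CELL[y % 2][x % 2] for x in range(width)]
            for y in range(height)]

pattern = bayer_pattern(4, 4)
counts = {c: sum(row.count(c) for row in pattern) for c in "RGB"}
# Half of the 16 pixels are green, a quarter each are red and blue:
print(counts)  # {'R': 4, 'G': 8, 'B': 4}
```

Any sensor size that is a multiple of two in each dimension preserves the half-green, quarter-red, quarter-blue proportion, since each 2×2 cell contributes two G, one R and one B pixel.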
The Bayer pattern is based on the premise that the human eye derives most of its luminance data from green light. Therefore, an image with a higher resolution can be generated when more of the pixels are made green, compared to an RGB color filter which alternates equal numbers of red, green and blue pixels.
The first element directly affecting the image quality of the image sensor is the lens, which focuses light onto the image sensor. The lens should focus the light exactly onto the image sensor, transmit a large amount of photons evenly onto the image pickup device, and minimize differences in transmittance among the components of the light so as to prevent color deviation between the colors.
However, portable apparatuses recently developed and marketed are becoming slimmer and more miniaturized, which requires slimmer and smaller sensor modules. At the same time, demand is increasing for image sensors having a larger number of pixels in such portable apparatuses.
Accordingly, the lens cannot be kept at a sufficient distance from the image pickup device and does not transmit the light effectively. Nor does the lens transmit the photons evenly to the image pickup device.
Furthermore, the closer to the corners of the image pickup device, the more the amount of light transmitted through the lens is reduced, which causes significant color deviation, so that unexpected colors are introduced into the overall image.
FIG. 1 shows an example of color deviation occurring in each quadrant of a photographed image. When the image is divided into four quadrants I, II, III and IV as shown in FIG. 1, at least one of red, green and blue bulges out strongly compared with the others at the corners A, B, C and D of the respective quadrants, which causes the color deviation to be uneven over the whole image.
Such uneven color deviation further causes image distortion, since it is not one color that bulges out in all four quadrants but a different color that bulges out in each quadrant; for example, red bulges out at the corner A and blue at the corner C.
Even the beginning point, at which the color deviation begins to become uneven, can be different in each quadrant. Since the distances between the center of the image and this beginning point differ among the quadrants, being RA for quadrant I, RB for quadrant II, RC for quadrant III and RD for quadrant IV, image distortion also occurs.
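The per-quadrant behavior described above can be sketched as follows. This is an illustrative sketch only: the quadrant-to-corner mapping (I = top-left through IV = bottom-right) and the numeric onset radii standing in for RA through RD are assumptions for illustration, since FIG. 1 and the actual lens characteristics are not reproduced here.

```python
import math

# Hypothetical onset radii (stand-ins for RA..RD) at which the color deviation
# becomes uneven in each quadrant; real values depend on the lens and module.
ONSET_RADIUS = {"I": 120.0, "II": 130.0, "III": 115.0, "IV": 125.0}

def quadrant_of(x, y, width, height):
    """Assign pixel (x, y) to a quadrant. Assumed mapping:
    I = top-left, II = top-right, III = bottom-left, IV = bottom-right."""
    top = y < height / 2.0
    left = x < width / 2.0
    if top:
        return "I" if left else "II"
    return "III" if left else "IV"

def deviation_is_uneven(x, y, width, height):
    """True if the pixel lies beyond its own quadrant's onset radius,
    i.e. in the region where the color deviation has become uneven."""
    cx, cy = width / 2.0, height / 2.0
    r = math.hypot(x - cx, y - cy)
    return r > ONSET_RADIUS[quadrant_of(x, y, width, height)]

print(deviation_is_uneven(0, 0, 640, 480))      # corner pixel  -> True
print(deviation_is_uneven(320, 240, 640, 480))  # center pixel  -> False
```

The key point the sketch captures is that the threshold radius is looked up per quadrant, so the region of uneven deviation is not a single circle around the image center.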
The pixels in the central part and the pixels in the periphery of the CFA of the image sensor are exposed to a light source from different positions. These minute differences in position cause differences in illumination, and such differences in illumination affect color because of differences in light frequency and refractive index. Consequently, color distortion and a reduction in signal amplitude dependent on the position of the pixels inevitably occur, degrading the quality of primitive images.
In order to overcome these problems, a method has been introduced that equalizes the luminance in an image by compensating for the lens shading phenomenon when photographing an image of a white area. However, the correction is processed centering on the center of the image, and in a lump according to the distance from the center of the image.