Image sensors are sensitive to light across the visible spectrum. The image sensors used in digital imaging, however, inherently produce black and white (light and dark) images. To capture color images, band-pass color filters are placed in front of the image sensor cells, over the photosensitive areas of the cells. Color filters are typically pigmented or dyed materials that pass only a narrow band of visible light, e.g., red, blue, or green. For most low cost CMOS or CCD image sensors, the color filters are integrated with the sensor cells. A common example of a color filter pattern is the tiled color filter array illustrated in U.S. Pat. No. 3,971,065, commonly referred to as “the Bayer pattern” color filter. The color filters allow what would otherwise be black and white image sensors to produce color images.
As shown in FIG. 1, the Bayer pattern 15 is an array of repeating red (R), green (G), and blue (B) filters. Half of the filters in the Bayer pattern 15 are green, while one quarter are red and the other quarter are blue. As shown, the pattern 15 repeats a row of alternating red and green color filters followed by a row of alternating blue and green filters. The Bayer patterned filters may be deposited on top of an array 20 of pixel sensor cells 22 in the manner shown in FIG. 2. Specifically, an array 20 of pixel sensor cells 22 is formed in a semiconductor substrate 10. Each of the pixel sensor cells 22 has a photosensitive element 12, which may be any photon-to-charge converting device, such as a photogate, photoconductor, or photodiode. The color filter array 25 is typically formed over a metal layer 18 in the array 20, separated from the photosensitive element 12 by insulating layers such as an interlevel dielectric layer (ILD) 14 and a passivation layer 16. The metal layer 18 may be opaque and used to shield the area of the pixels that is not light sensitive. Convex lenses 21 are formed over the color filter array 25. In operation, incident light is focused by the lenses 21 through the filters of the array 25 to the photosensitive element 12.
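The Bayer sampling described above can be illustrated with a minimal sketch, assuming NumPy. The function below simulates what the filtered sensor records: each cell keeps only the one color channel its filter passes. The particular phase chosen here (red in the top-left corner, so even rows alternate red/green and odd rows alternate green/blue) is one assumed orientation of the pattern; actual sensors may start on any of the four phases.

```python
import numpy as np

def bayer_mosaic(rgb):
    """Sample an H x W x 3 RGB image through an assumed Bayer pattern.

    Even rows alternate R, G; odd rows alternate G, B, so half of all
    cells are green, one quarter red, and one quarter blue.
    Returns a single-channel H x W mosaic: one color value per cell.
    """
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w), dtype=rgb.dtype)
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red: even rows, even cols
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green: even rows, odd cols
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green: odd rows, even cols
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue: odd rows, odd cols
    return mosaic
```

The mosaic is what a black and white sensor array produces behind the filters; recovering three color values per pixel from it is the interpolation step described next.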
The first step in color processing, when a conventional Bayer patterned color filter 15 is utilized, is typically called interpolation. Since each pixel sensor cell produces a signal indicative of only one color, interpolation from the neighboring pixel signals is used to produce values for the other two colors for that pixel. For example, with reference to FIG. 1, the pixel sensor cell corresponding to the green filter 3 produces only a signal representing green light. To obtain an approximation of the amount of red and blue light at this pixel sensor cell, values are interpolated from the neighboring red pixel cells (left and right) and the neighboring blue pixel cells (above and below), respectively. Further processing is then performed on the signals to bring the produced image closer to the scene as seen by the human eye.
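The interpolation at a green pixel can be sketched as follows, assuming NumPy and the same mosaic layout as FIG. 1 (a green cell on a red/green row, with red neighbors to the left and right and blue neighbors above and below). This is a simple nearest-neighbor average for illustration; it is not the only interpolation scheme, and a green cell on a blue/green row would have the roles of red and blue reversed.

```python
import numpy as np

def interpolate_at_green(mosaic, row, col):
    """Estimate (R, G, B) at a green cell on a red/green mosaic row.

    The green value is read directly; red is averaged from the
    left/right neighbors and blue from the neighbors above and below,
    mirroring the FIG. 1 example. Assumes the cell is not on a border.
    """
    green = float(mosaic[row, col])
    red = (float(mosaic[row, col - 1]) + float(mosaic[row, col + 1])) / 2.0
    blue = (float(mosaic[row - 1, col]) + float(mosaic[row + 1, col])) / 2.0
    return red, green, blue
```

Border cells lack one or more neighbors and in practice are handled by clamping or mirroring the mosaic edges before interpolating.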
As long as the color in the image being captured changes slowly relative to the filter pattern, color interpolation works well. However, at the edges of observed objects, or for very fine details, color may be interpolated incorrectly and artifacts can result. For example, a small white dot in a scene might illuminate only a single blue pixel. The white dot might then come out blue if it is surrounded by black or some other color, depending on the result of the interpolation. This effect is called aliasing. One way to reduce aliasing is to use a blurring (or “anti-aliasing”) filter, which deliberately discards fine details. Defocusing the camera lens accomplishes almost the same thing. However, these are not always desirable alternatives.
As image sensors are scaled to increasingly smaller sizes, low light sensitivity becomes a very challenging problem. The conventional Bayer color filters are all narrow band-pass in nature: none of them captures the whole visible spectrum. In low light conditions, this deficiency is even more apparent. One approach to increasing the sensitivity of a pixel sensor cell is to increase its photosensitive area. As customer demands increasingly require smaller devices, however, this approach is undesirable.
Accordingly, there is a need for a color filter array for use with image sensors that increases the low light sensitivity of the image sensor without increasing the sensing area of the image sensor. There is also a need for a method of processing the improved signals from such a filtered sensor.