1. Field of the Invention
The present invention relates generally to a color image sensor and, more particularly, to an integrated color pixel (ICP) and a method of integrating control of wavelength responsivity into the ICP itself during a standard integrated circuit (IC) design and fabrication process.
2. Description of the Related Art
A basic image sensor captures an image in gray scale. That is, the image sensor, with its monochrome pixels, generally records the image in a series of tones ranging from pure white to pure black. A color image sensor captures the image in colors with monochrome pixels and color filters. The color filters are utilized for controlling the wavelength responsivity of the individual monochrome pixels. The wavelength transmittance of the color filters, along with the other optical elements, determines the wavelength responsivity of the color channels of the color image sensor.
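The relationship described above can be sketched numerically: to a first approximation, a color channel's responsivity at each wavelength is the product of the filter's transmittance and the monochrome pixel's responsivity at that wavelength. The sample values below are illustrative assumptions, not measured data.

```python
# Sketch: a color channel's responsivity is (approximately) the product of
# the filter's wavelength transmittance and the monochrome pixel's
# responsivity at each sampled wavelength. All values are made-up samples.
wavelengths_nm = [450, 550, 650]
pixel_responsivity = [0.30, 0.45, 0.40]        # A/W, monochrome pixel (assumed)
red_filter_transmittance = [0.05, 0.10, 0.85]  # dimensionless (assumed)

# Element-wise product gives the red channel's effective responsivity.
red_channel = [t * r for t, r in zip(red_filter_transmittance, pixel_responsivity)]
print(red_channel)
```

The filter can only attenuate, so each channel value is bounded above by the underlying monochrome responsivity at that wavelength.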
A conventional way to integrate the color filters and hence the wavelength responsivity into the color image sensor is to utilize a color filter process in which a color filter array (CFA) comprising a regular array of differently colored filters is overlaid or deposited onto the monochrome pixels to separate out the colors of the light reflected from an image.
The color filter process in general and CFA technology in particular are well known in the art and thus are not further described herein. An exemplary teaching on the color filter process for semiconductor array imaging devices can be found in U.S. Pat. No. 6,274,917, which is by Fan et al. and assigned to Taiwan Semiconductor Manufacturing Company (TSMC), Hsin-Chu, Taiwan.
The color filter pattern or patterned CFA of each color image sensor can be specifically designed to yield sharper images, truer colors, and/or any color effects desired. Most modern semiconductor image sensors, including complementary metal-oxide semiconductor (CMOS) and charge-coupled device (CCD) image sensors, utilize red, green, and blue (RGB) filters. Some image sensors in cameras use cyan, magenta, and yellow (CMY) filters. The color patterns also vary from company to company, with the most popular pattern being the Bayer mosaic pattern, taught by Bayer in U.S. Pat. No. 3,971,065 and assigned to Eastman Kodak Company, Rochester, N.Y.
Bayer teaches a sensing array for color imaging having individual luminance- and chrominance-sensitive elements that are so intermixed that each type of element, i.e., according to sensitivity characteristics, occurs in a repeated pattern with luminance elements dominating the array. In this case, filters selectively transmissive to light in the green region of the spectrum are utilized in producing luminance-sensitive elements, and red and blue transmitting filters are used for producing first and second chrominance-sensitive elements. In other words, in Bayer, selectively sensitized elements, i.e., individual monochrome pixels with corresponding color filters overlaid thereupon, cooperating in interlaid sampling patterns, control the wavelength responsivity of the color imaging array. An exemplary Bayer mosaic pattern 100 is shown in FIG. 1(a). An exemplary application of the Bayer mosaic pattern 100 utilized in a conventional color pixel array 110 is shown in FIG. 1(b), where monochrome pixels 30 are overlaid with a CFA 34. As shown in FIG. 1(b), the CFA 34 is sandwiched between layers 33 and 35. Layers 33 and 35 can be specified as an anti-reflection layer and an insulation/protection layer, respectively. Layer 32 may comprise a variety of layers, including additional metal layers for a variety of application-specific purposes.
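The repeating structure Bayer teaches, green (luminance) elements dominating with red and blue (chrominance) elements interleaved, can be sketched as a small illustrative function. The particular 2x2 tile below (GRBG) is one common arrangement and is an assumption for illustration, not taken from FIG. 1(a).

```python
# Illustrative sketch: tile an H x W monochrome pixel array with a 2x2
# Bayer unit cell. Green appears twice per cell, so luminance-sensitive
# elements dominate the array, as Bayer teaches. The GRBG ordering below
# is an assumed example arrangement.
def bayer_pattern(height, width):
    tile = [["G", "R"],
            ["B", "G"]]  # two greens per 2x2 cell
    return [[tile[r % 2][c % 2] for c in range(width)]
            for r in range(height)]

pattern = bayer_pattern(4, 4)
for row in pattern:
    print(" ".join(row))
```

For any even-sized array, exactly half the elements are green and one quarter each are red and blue, which is the luminance-dominated sampling ratio of the Bayer mosaic.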
As is well known in the art, one common problem associated with the use of a CFA is that the pixel, and therefore the sensor, sensitivity varies with the specific type of color. Several implementations have been proposed to enhance the pixel sensitivity. One such example can be found in U.S. Pat. No. 6,057,586, titled “Method and apparatus for employing a light shield to modulate pixel color responsivity” by Bawolek et al. and assigned to Intel Corporation of Santa Clara, Calif.
In Bawolek et al., a light shield is utilized to modulate/modify pixel sensitivity. According to Bawolek et al., a pixel cell refers to a light sensing circuit and a CFA overlaid on top of a light sensor. The light sensor, as commonly understood, can be any means that receives and converts incident light into an electrical signal representative of that light. The pixel cell having a color filter array material of a first color disposed above the light sensor has a first relative sensitivity. A modulating light shield is disposed above the light sensor to modulate the pixel sensitivity. The light shield forms an aperture whose area is substantially equal to a light receiving area of the light sensor adjusted by a reduction factor. The reduction factor is the result of an arithmetic operation between the first relative sensitivity and a second relative sensitivity, associated with a second pixel of a second color. In Bawolek et al., the light shield, constructed using one of the pixel metal layers, and the CFA overlaid thereupon control the sensitivity and wavelength responsivity of the pixel, respectively. Bawolek et al. particularly note that it is important that the other metal layers do not intrude into the non-covered optical path specified by the light shield layer. FIG. 2(a) shows a conventional color pixel array 200 having monochrome pixels 201 overlaid with a CFA 210 having a typical Bayer RGB pattern. FIG. 2(b) shows a modified color pixel array 220 comprising the monochrome pixels 201 overlaid with a light shield metal layer 202 and the CFA 210. The light shield metal layer 202 has a plurality of openings 203, where the area of the openings is specifically configured based on the color responsivity, i.e., red, green, or blue, of the pixel cell. The openings employed for controlling the pixel sensitivity can range from approximately one micron by one micron to five microns by five microns.
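The aperture sizing described above can be sketched numerically. Bawolek et al. leave the "arithmetic operation" between the two relative sensitivities general; the simple ratio used below is an assumption for illustration, chosen so that the more sensitive channel receives a proportionally smaller opening.

```python
# Hedged sketch of light-shield aperture sizing in the spirit of
# Bawolek et al. Assumption: the reduction factor is the ratio of a
# reference channel's relative sensitivity to this channel's relative
# sensitivity, clamped so the aperture never exceeds the full
# light-receiving area of the sensor.
def aperture_area(full_area_um2, this_sensitivity, reference_sensitivity):
    reduction_factor = reference_sensitivity / this_sensitivity
    return full_area_um2 * min(reduction_factor, 1.0)

# Example: for a 5 um x 5 um light-receiving area (25 um^2), a channel
# twice as sensitive as the reference gets half the open area.
print(aperture_area(25.0, 2.0, 1.0))
```

After shielding, each channel's effective sensitivity (relative sensitivity times open area) is equalized toward the least sensitive channel, which is the stated purpose of the modulating light shield.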
The various pixel implementations, including the aforementioned pixel sensitivity modulation layer, are mostly intended to improve CCD technology-based or CMOS technology-based image sensors. CCD image sensors have drawbacks well known in the art, including limited on-chip signal processing capability and high power consumption. CMOS image sensors have been developed to overcome these drawbacks. The advantages of CMOS image sensors over CCD image sensors are well known in the art and thus are not described herein. CMOS image sensors utilizing CFAs, however, still have the following weaknesses.
A typical CFA may include multiple layers of superimposed color filters, which often cause a color “cross-talk” problem. Also, because the height of a pixel inversely affects the pixel's efficiency, the added height of these layers may cause a decrease in pixel efficiency.
What is more, as discussed heretofore, a typical CMOS color image sensor generally comprises a basic “monochrome” image sensor with color filters overlaid thereupon. The hybrid nature of the CMOS image sensor process and the color filter process leads to high fabrication costs and inflexibility in the color image sensor design flow. Utilizing thin film technology, Böhm et al., in “High Dynamic Range Image Sensors in Thin Film on ASIC Technology for Automotive Applications,” proposed Thin-Film-on-ASIC (TFA) image sensors to overcome the weaknesses of conventional CMOS color image sensors and to further reduce color image sensor manufacturing costs.
According to Böhm et al., a TFA image sensor is a vertically integrated image sensor in which the optical detector of the sensor is deposited onto an ASIC wafer by a plasma enhanced chemical vapor deposition (PECVD) process in a cluster deposition system. The ASIC wafer itself is fabricated using a standard 0.7 μm CMOS process.
Thin film technology provides the possibility of retaining the entire usual design flow and fabrication steps employed for ASICs. As shown in FIG. 3(a), a typical TFA image sensor 300 consists of an amorphous silicon (a-Si:H) based optical detector 310 deposited on top of a crystalline silicon (x-Si) ASIC 301 in a low temperature PECVD process. Amorphous silicon is an excellent material for optical detectors but not for electronic circuits. Crystalline silicon, on the other hand, has poor photoelectric properties but a wide range of highly developed and abundantly available technologies for fabricating powerful integrated circuits.
FIG. 3(a) also shows the layer sequence of the typical TFA image sensor 300, where an insulation layer 320 separates the optical detector 310 from the ASIC 301. The insulation layer 320 is patterned in order to provide contact holes between the optical detector 310 and the circuitry of the ASIC 301. Generally, there is one hole per pixel. The ASIC 301 typically includes identical pixel circuitry underneath each detector and peripheral circuitry outside the light sensitive area. The a-Si:H thin film system 340 is sandwiched between a metal rear electrode 330, which is usually the third metal layer of the ASIC 301, and a transparent front electrode 350. Due to its higher absorption coefficient and its maximum spectral response for green light, an amorphous silicon detector is better qualified for an image sensor than a crystalline silicon detector. The thin film deposition sequence is adaptable to the specific requirements of an application. For example, the thin film system 340 may comprise multiple layers. According to Böhm et al., at a thickness of less than 1 μm, the thin film system consists of four to seven successively deposited layers.
FIG. 3(b) illustrates how the TFA image sensor 300 is developed and fabricated. As shown in FIG. 3(b), the ASIC 301 and the optical detector 310 are separately developed in TFA image sensor design and fabrication process 311 so that the photoelectric properties of the detector can be optimized independent of the ASIC process. The ASIC 301 wafer is manufactured and the optical detector 310 including the a-Si:H thin film system 340 is subsequently deposited upon the ASIC 301. TFA image sensors are suited for automotive vision systems and are in development by Silicon Vision in cooperation with the Institute for Semiconductor Electronics (IHE) at the University of Siegen, both of Germany.
According to Böhm et al., one of the major benefits of the TFA technology is the possibility of depositing thin film detectors with adjustable spectral sensitivity on top of the ASIC. For a color sensor array, this leads to a 3-colors-in-one-pixel sensor design. The spectral response can be shifted by varying the voltage applied to the pixel. FIG. 4(a) shows a pixel block diagram of the TFA color sensor array 400. The detector circuit 410 keeps the rear electrode at constant potential in order to suppress lateral balance currents. The photocurrent is fed into one of the color circuits 420, 430, 440, one at a time, during the integration phase. The TFA color sensor array 400 generates and stores the complete RGB information inside each pixel without intermediate readout operation. For readout, the integrated color voltages are sequentially applied to the column output line 450.
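The per-pixel sequential color integration described above can be sketched as follows. The class and method names, and the linear charge model (Q = I × t), are illustrative assumptions; the actual circuits 420, 430, 440 integrate photocurrent onto storage nodes selected one at a time.

```python
# Hedged sketch of 3-colors-in-one-pixel operation in a TFA color array:
# during each sub-integration, the applied bias selects one spectral
# response, and the photocurrent charges that color's storage node.
# Readout then presents the three integrated values sequentially.
class TFAPixel:
    def __init__(self):
        # One storage node per color circuit, all inside the pixel.
        self.stored = {"R": 0.0, "G": 0.0, "B": 0.0}

    def integrate(self, color, photocurrent, integration_time):
        # Only one color circuit accumulates charge at a time (Q = I * t).
        self.stored[color] += photocurrent * integration_time

    def readout(self):
        # Integrated color values applied sequentially to the column line.
        return [self.stored[c] for c in ("R", "G", "B")]

pixel = TFAPixel()
# Three sub-integrations, one per spectral setting (illustrative values).
for color, current in (("R", 1.0), ("G", 2.0), ("B", 0.5)):
    pixel.integrate(color, current, 1.0)
print(pixel.readout())
```

Note that the complete RGB information is available from the pixel in a single readout pass, with no intermediate readout between the color sub-integrations, mirroring the operation Böhm et al. describe.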
Wavelength responsivity of a TFA color pixel is controlled by the thin film deposited upon a standard ASIC wafer. That is, the essential ability of controlling wavelength responsivity of a color image sensor is still dependent upon a separate and independent thin film process applied subsequent to standard semiconductor processing steps. What is more, pixels in the vertically integrated TFA image sensor having multiple layers deposited thereon may suffer from reduced pixel efficiency because of the increased overall height of the pixels.
What is needed is a method of integrating control of wavelength responsivity into the pixel itself, generating a complete and highly integrated color pixel for quality color imaging using an entirely standard semiconductor IC design and fabrication process. Such a method would solve the problem of color cross-talk in image sensors utilizing superimposed color filters and eliminate the need to subsequently apply a separate color filter process, such as the thin film process, thereby simplifying the color image sensor design flow, further reducing color image sensor manufacturing cost, and potentially increasing pixel efficiency.