1. Field of the Invention
The present invention relates to a solid-state imaging device, for example a MOS image sensor, a method of manufacturing the same, and a camera using such a solid-state imaging device.
2. Description of the Related Art
Solid-state imaging devices can be roughly classified into charge transfer solid-state imaging devices typified by CCD image sensors and amplified solid-state imaging devices typified by CMOS image sensors. The CCD image sensor may need a higher power supply voltage than the CMOS image sensor because the CCD image sensor may require a high driving voltage for the transfer of a signal electric charge.
In recent years, therefore, CMOS image sensors have been used more than CCD image sensors as solid-state imaging devices mounted in mobile apparatuses, such as camera-incorporated mobile phone units and personal digital assistants (PDAs), because CMOS image sensors have advantages over CCD image sensors in terms of power supply voltage, power consumption, and the like.
The solid-state imaging device used in any mobile apparatus or the like has a reduced area per pixel as a result of miniaturization and increased resolution. In addition, the area of the photodiode provided as a photoelectric conversion portion is reduced along with the decrease in pixel area, which may result in a decrease in the number of photoelectrically converted electrons (holes). A reduction in pixel area is disadvantageous for the collection of light and leads to a decrease in sensitivity, SNR, or the like. Thus, for example, a solid-state imaging device for brightly taking an image of a dark subject or the like has been known in the art. This solid-state imaging device includes a unit pixel matrix of color pixels each having an optical inner filter layer for blocking infrared light (IR cut filter) and a pixel on which infrared light is incident (Japanese Unexamined Patent Application Publication No. 2006-190958).
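The SNR penalty described above can be illustrated with a short numerical sketch. Assuming a shot-noise-limited pixel, the noise is the square root of the signal, so SNR = S/√S = √S, and the collected signal scales with photodiode area. The electron counts below are purely illustrative values, not figures from the cited publication.

```python
import math

# Under the shot-noise limit, SNR = S / sqrt(S) = sqrt(S),
# where S is the number of collected photoelectrons.
def shot_noise_snr_db(signal_electrons):
    return 20 * math.log10(math.sqrt(signal_electrons))

# Illustrative (assumed) numbers: halving the pixel's linear pitch
# quarters its photodiode area, and the signal scales with area.
large_pixel_e = 10000              # assumed signal for the larger pixel
small_pixel_e = large_pixel_e / 4  # area, and hence signal, quartered

print(shot_noise_snr_db(large_pixel_e))  # 40 dB
print(shot_noise_snr_db(small_pixel_e))  # about 34 dB, i.e. ~6 dB loss
```

Each quartering of the photodiode area thus costs roughly 6 dB of shot-noise-limited SNR, which is why recovering signal from otherwise discarded infrared light is attractive.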
The incidence of infrared light on a typical color pixel results in insufficient color reproducibility. Therefore, in a method for improving the sensitivity and SNR of a pixel by using infrared light, the incidence of infrared light on the color pixels is blocked by the above optical inner filter layer (IR cut filter), for example a stack of dielectric layers that exploits refractive-index differences.
FIG. 1 illustrates a CMOS image sensor including a filter layer for blocking infrared light. FIG. 2 illustrates a CMOS image sensor without a filter layer for blocking infrared light. In each of FIGS. 1 and 2, a pixel is schematically represented only by a photodiode (PD), with the pixel transistors omitted, so that the configuration of the CMOS image sensor can be clearly understood.
A CMOS image sensor 1 as illustrated in FIG. 2 includes an imaging area on the principal surface of a semiconductor substrate 2. The imaging area is formed of a plurality of pixels provided in a two-dimensional array. Each of the pixels has a photodiode (PD) 3 as a photoelectric conversion portion and a plurality of pixel transistors (MOS transistors, not shown). A plurality of wiring layers 6, in which a plurality of layered lines 5 are formed with an insulating interlayer 4 in between, are formed on the principal surface of the pixel-formed semiconductor substrate 2. Furthermore, a color filter 7 and an on-chip microlens 8 are formed above the plurality of wiring layers 6 with a planarizing layer (not shown) in between.
A CMOS image sensor 11 as illustrated in FIG. 1 includes an imaging area formed of a plurality of pixels provided in a two-dimensional array. Each of the pixels has a photodiode (PD) 3 as a photoelectric conversion portion and a plurality of pixel transistors (MOS transistors, not shown) on the principal surface of a semiconductor substrate 2. A plurality of wiring layers 6, in which a plurality of layered lines 5 are formed with an insulating interlayer 4 in between, are formed on the principal surface of the pixel-formed semiconductor substrate 2. Furthermore, an optical inner filter layer (IR cut filter layer) 12 is formed above the plurality of wiring layers 6 for each pixel on which the incidence of infrared light needs to be blocked. In other words, the optical inner filter layer 12 is formed above each of the pixels of red (R), green (G), and blue (B), but no optical inner filter layer 12 is formed above one pixel (that is, the IR pixel). A buried layer 13 is formed in the area where no optical inner filter layer 12 is formed, and a color filter 7 and an on-chip microlens 8 are then formed with a planarizing layer 14 in between. Here, a unit pixel matrix includes four pixels, that is, the R, G, and B pixels and the IR pixel. The color filter for the IR pixel is formed of a filter transmitting visible light and infrared light. Furthermore, the optical inner filter layer 12 formed of a plurality of dielectric layers needs to be arranged below the on-chip microlens 8, the color filter 7, and the planarizing layer 14 made of an organic film or the like because of the restricted film-forming temperature.
In the CMOS image sensor 11, the IR pixel positively uses infrared light. Thus, the sensitivity, SNR, and the like of the CMOS image sensor 11 can be improved, allowing a user to take a bright image with suitable color tone, for example, when the user wishes to image a dark subject.
On the other hand, for increasing the efficiency of light focusing on the photodiode (PD) of each pixel, a solid-state imaging device having an inner-layer lens has been known in the art (see Japanese Unexamined Patent Application Publication No. Heisei 11-103037).
Here, image sensors are devices used in a wide range of applications including cameras and video cameras.
The image sensors used in such devices are formed of many pixels, and the efficiency of the entire device can be determined by the dimensions and structure of each pixel.
A device for multiple colors includes pixels of three different colors, red (R), green (G), and blue (B), obtained by providing the pixels with the respective absorbing color filter materials, typically solid films derived from organic materials.
In general, a lens having the same size as the pixel is used for improving the collection of light within the pixel. In other words, the lens is provided to collect incident light so that the light is focused on the light-receiving section of the pixel. The light incident on the pixel is thus captured by the pixel as much as possible and prevented from scattering into the surrounding pixels.
Recently, several investigators have proposed providing open pixels for receiving the whole visible spectrum alongside color pixels (see, for example, Japanese Unexamined Patent Application Publication No. Heisei 6-205178 and U.S. Pat. No. 6,211,521). Such open pixels may receive light whose wavelength extends beyond the visible range or includes infrared light.
The use of open pixels joined to the respective color pixels makes it possible to generate a color image at an extremely low level of light intensity and may improve the response of the device.
An IR cut filter has been mounted on the outside of the device to improve color performance.
However, in a device using open pixels designed as described above, the IR cut filter should be applied only to the color pixels that need infrared light blocked. Thus, there is a need for an on-chip IR cut filter manufactured so that pixels with and without the filter can easily be provided.
The use of an on-chip filter is not a new idea, but it is difficult to design in practice.
On-chip filters typically used are mainly formed by alternately stacking materials having different refractive indexes, thereby tailoring the spectrum of transmitted light as required (see, for example, Inaba et al., IEEE Electron Device Lett., 27, 6, 2006, pp. 457).
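The spectral behavior of such an alternating dielectric stack can be sketched with the standard characteristic (transfer) matrix method for normal incidence. The layer materials, refractive indexes (roughly 2.3 and 1.46, e.g. a high-index nitride or titania-like film over a silica-like film), pair count, and 850 nm design wavelength below are assumptions for illustration, not parameters from the cited work.

```python
import numpy as np

def stack_reflectance(wavelength, layers, n_in=1.0, n_sub=1.46):
    """Normal-incidence reflectance of a dielectric stack via the
    characteristic (transfer) matrix method.
    layers: list of (refractive_index, thickness) tuples, thickness in
    the same units as wavelength."""
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        delta = 2 * np.pi * n * d / wavelength  # phase thickness
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, n_sub])
    r = (n_in * B - C) / (n_in * B + C)
    return abs(r) ** 2

# Illustrative quarter-wave stack tuned to reflect near-infrared light:
# alternating high/low-index films, each a quarter wave thick at the
# assumed 850 nm design wavelength.
lam0 = 850.0
n_hi, n_lo = 2.3, 1.46
layers = [(n_hi, lam0 / (4 * n_hi)), (n_lo, lam0 / (4 * n_lo))] * 6

R_ir = stack_reflectance(850.0, layers)   # near-IR: strongly reflected
R_vis = stack_reflectance(550.0, layers)  # green: outside the stopband
print(f"R(850 nm) = {R_ir:.3f}, R(550 nm) = {R_vis:.3f}")
```

The sketch shows the principle the text describes: a modest number of alternating index pairs reflects the design band (here, near-infrared) while largely transmitting visible wavelengths outside the stopband, so the stack acts as an IR cut filter for the pixels above which it is formed.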
An etching technology is used to form such on-chip filters independently on each pixel that requires a filter. Furthermore, it is possible in principle at present to form an on-chip filter that separates light at any optical band within the visible and infrared ranges. This means that a color image sensor having pixels for infrared or white light is of practical use.
In such filters, a plurality of layers are usually stacked on a planar structure so that uniformity and control of layer thickness can be improved (see, for example, Japanese Unexamined Patent Application Publications No. 2005-174967 and No. 2006-351801).