1. Technical Field
The present invention relates to a solid state imaging device and a method of manufacturing the same and, more particularly, to a Complementary Metal Oxide Semiconductor (CMOS) image sensor and a method of manufacturing the same.
2. Description of the Related Art
Image sensors convert optical images into electrical signals. Image sensors can be classified into Charge Coupled Device (CCD) image sensors and CMOS image sensors. The CCD image sensor requires a complicated driving scheme and a complicated manufacturing process, and it is difficult to integrate signal processing circuits on a single CCD chip. In contrast, the CMOS image sensor can be fabricated by standard CMOS techniques. Accordingly, recent research on image sensors has focused on the CMOS image sensor.
The CMOS image sensor includes a plurality of unit pixels for photoelectrically converting incident light. FIG. 1A is a circuit diagram illustrating a unit pixel of a conventional CMOS image sensor. Referring to FIG. 1A, the unit pixel includes a photodiode 15 for sensing light, and a transfer transistor 20 that transfers the charges generated by the photodiode 15. Additionally, a reset transistor Rx periodically resets a floating diffusion region FD that stores the transferred charges, and a source follower SF buffers the signals resulting from the charges stored in the floating diffusion region FD. The source follower SF shown in FIG. 1A consists of two serially-connected MOS transistors M1 and R1.
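As an illustrative aid only (not part of the related art being described), the read-out sequence above can be sketched as a toy behavioral model: the reset transistor Rx pulls the floating diffusion FD to the supply, the transfer transistor moves the photodiode charge onto FD, and the source follower SF buffers the resulting voltage. All numeric values (supply voltage, conversion gain, quantum efficiency) are hypothetical and chosen only for illustration:

```python
# Toy behavioral model of the 4T unit-pixel read-out sequence.
# All values are illustrative, not taken from the patent.
VDD = 2.8           # supply voltage (V), hypothetical
CG_UV_PER_E = 60.0  # conversion gain at the floating diffusion (uV/e-), hypothetical

class UnitPixel:
    def __init__(self):
        self.pd_electrons = 0  # charge integrated on the photodiode
        self.v_fd = 0.0        # floating diffusion (FD) voltage

    def integrate(self, photons, qe=0.5):
        """Photodiode converts incident photons to electrons (photoelectric effect)."""
        self.pd_electrons += int(photons * qe)

    def reset(self):
        """Reset transistor Rx pulls the floating diffusion FD to VDD."""
        self.v_fd = VDD

    def transfer(self):
        """Transfer transistor moves the photodiode charge onto FD; the FD
        voltage drops in proportion to the transferred charge."""
        self.v_fd -= self.pd_electrons * CG_UV_PER_E * 1e-6
        self.pd_electrons = 0

    def read(self):
        """Source follower SF buffers the FD voltage (idealized as unity gain)."""
        return self.v_fd

px = UnitPixel()
px.reset()                     # FD sits at VDD
px.integrate(photons=10_000)   # exposure
px.transfer()                  # charge-to-voltage conversion at FD
print(f"output: {px.read():.3f} V")
```

The pixel signal is thus the voltage drop on FD relative to its reset level, which is why Rx must reset FD before every read.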
FIG. 1B shows a unit pixel of the CMOS image sensor of FIG. 1A having a photodiode and a transfer transistor integrated on a semiconductor substrate. Referring to FIG. 1B, the photodiode 15 and the transfer transistor 20 are formed in a semiconductor substrate 11 such that the photodiode 15 is disposed on one side of the transfer transistor 20. As is well known in the art, the transfer transistor 20 includes a gate electrode 21 and source and drain regions 22a and 22b. The source and drain regions 22a and 22b are connected to an external power source (not shown) through multilayered metal interconnects 25 and 30, respectively. An interlayer insulating layer 23 is interposed between the first metal interconnect 25 and the semiconductor substrate 11, and an interlayer insulating layer 27 is interposed between the first metal interconnect 25 and the second metal interconnect 30. A light blocking layer 40 is formed on the resultant structure including the transfer transistor 20. The metal interconnects 25 and 30 may be formed of a conductive shielding material, e.g., aluminum.
Reducing the pixel size to increase the pixel density of the CMOS image sensor leads to crosstalk between adjacent pixels. The crosstalk is caused both by the migration of electrons (e⁻) produced within the semiconductor substrate 11 by the photoelectric effect from incoming optical energy, and by inclined incident light 50 with a wide angle of incidence (i.e., the angle between the direction of the incident light and the normal direction of the metal interconnects 25 and 30). The crosstalk resulting from electron migration is generally negligible, but the inclined incident light 50 can produce large crosstalk in the CMOS image sensor.
Referring to FIG. 1B, the inclined incident light 50 causes crosstalk because the metal interconnects 25 and 30 and the light blocking layer 40 disposed around the photodiode 15 reflect the incident light 50 onto another photodiode adjacent to the photodiode 15. Such crosstalk makes it difficult to recover data from individual pixels, and may cause color blurring or brighten the surrounding area when a bright image is captured, making it difficult to capture the image accurately.
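The geometric origin of this crosstalk can be made concrete with a small calculation (an illustrative sketch, not part of the related art): a ray entering the interlayer dielectric stack at an oblique angle refracts per Snell's law and then walks sideways by the stack thickness times the tangent of the refracted angle before reaching the photodiode plane. The stack thickness, refractive index, and pixel pitch below are hypothetical values chosen only to show the effect:

```python
import math

def lateral_shift_um(stack_thickness_um, incidence_deg, n_dielectric=1.46):
    """Lateral displacement of a ray traversing the interlayer dielectric stack.

    Snell's law gives the refracted angle inside the dielectric; the ray then
    shifts sideways by t * tan(theta_r) before reaching the photodiode plane.
    n_dielectric ~ 1.46 assumes an SiO2-like interlayer insulating layer.
    """
    theta_i = math.radians(incidence_deg)
    theta_r = math.asin(math.sin(theta_i) / n_dielectric)
    return stack_thickness_um * math.tan(theta_r)

# Hypothetical numbers: a 4 um metal/dielectric stack and 30-degree incidence.
shift = lateral_shift_um(stack_thickness_um=4.0, incidence_deg=30.0)
print(f"lateral shift: {shift:.2f} um")
```

For these assumed values the shift is on the order of 1.5 µm, comparable to the pitch of a small pixel, so obliquely incident light aimed at one photodiode can land on (or be reflected toward) its neighbor.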
The CMOS image sensor includes a color filter to realize a color image. The color filter has R (red), G (green), and B (blue) unit filters. The respective unit filters are arranged to correspond to respective unit pixels.
The R, G, and B filters each transmit light of a specific wavelength band while absorbing the remainder. In particular, the R filter transmits long-wavelength light of about 660~700 nm, the G filter transmits intermediate-wavelength light of about 510~590 nm, and the B filter transmits short-wavelength light of about 490~510 nm. Because the long-wavelength light transmitted by the R filter penetrates deeply into silicon, electron-hole pairs are generated deep in the substrate (photodiode region) of the unit pixel having the R filter (hereinafter referred to as an “R filter region”), and few photogenerated carriers are lost, allowing for excellent photo sensitivity.
However, because the short-wavelength light transmitted by the B filter is absorbed near the silicon surface, electron-hole pairs are generated at the surface of the photodiode, i.e., in a p-type photodiode region. As is well known in the art, the p-type photodiode region is provided to suppress a dark current source of the CMOS image sensor rather than to sense the light. Accordingly, many of the photogenerated carriers recombine near the surface and are lost, which significantly lowers the photo sensitivity.
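The depth argument above follows from Beer-Lambert absorption in silicon, I(x) = I₀·exp(−αx), whose 1/e penetration depth is 1/α. As an illustrative sketch (not part of the related art), the absorption coefficients below are approximate textbook-order values for crystalline silicon, assumed here only to show the trend across the three bands:

```python
# Approximate absorption coefficients of crystalline silicon (cm^-1).
# Illustrative order-of-magnitude values; real data varies with wavelength,
# temperature, and doping.
ALPHA_PER_CM = {
    "B (~470 nm)": 2.0e4,
    "G (~550 nm)": 7.0e3,
    "R (~650 nm)": 3.0e3,
}

def penetration_depth_um(alpha_per_cm):
    """1/e penetration depth from Beer-Lambert: I(x) = I0 * exp(-alpha * x)."""
    return (1.0 / alpha_per_cm) * 1e4  # convert cm to um

for band, alpha in ALPHA_PER_CM.items():
    print(f"{band}: ~{penetration_depth_um(alpha):.2f} um")
```

Under these assumed coefficients, blue light is absorbed within roughly half a micrometer of the surface, inside the p-type surface region, whereas red light generates carriers several micrometers deep in the photodiode, which is the disparity in collection efficiency the text describes.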
Therefore, even though the unit pixels are formed under the same conditions, the spectral characteristics of the filters cause the photo sensitivity to differ between the unit pixel with the R filter and the unit pixel with the B filter. Accordingly, the conventional CMOS image sensor exhibits inconsistent photo sensitivity.