1. Field of the Invention
The present invention generally relates to pixel sensors and more particularly to CMOS image sensors.
2. Background Description
Digital cameras have largely replaced film-based analog cameras, at least for amateur photography. A typical digital camera image sensor is an array of picture cells (pixels), each sensing a small fragment of the light for an entire image. Generally, the higher the number of pixels, the better the resulting images (pictures) and the larger an image may be viewed before becoming pixelated. Thus, the number of pixels is a primary measure of the image resolution, and directly affects the sharpness and crispness of the resulting images. Early digital cameras used bucket-brigade sensors with Charge Coupled Devices (CCDs) as pixel sensors. Integration, power, and frame rate considerations have driven the industry to convert from CCDs to image sensors based on more standard CMOS logic semiconductor processes.
A typical CMOS image sensor array is, simply, an array of photodiodes with connected CMOS support and sensor circuits. Light striking each photodiode creates electron-hole pairs. The photodiode captures and stores the electrons. CMOS support circuits sense the charge stored in each diode. A color pixel sensing red, green or blue is simply a diode covered by a red, green or blue filter, respectively, that blocks all light outside of that particular bandwidth. CMOS image sensors have allowed pixel density to increase well above four megapixels (4 MP), even as typical digital cameras have become more and more compact, e.g., some are even embedded in cell phones.
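The sensing model described above can be illustrated with a small sketch. The following is not from the specification; it is a hypothetical toy model in which each color filter passes mostly its own band, and a fraction of the absorbed photons (an assumed quantum efficiency) is stored as electrons in the photodiode.

```python
# Toy model of a filtered photodiode, per the description above:
# light striking the diode creates electron-hole pairs, the diode
# stores the electrons, and a color filter blocks out-of-band light.
# All numeric values below are hypothetical, for illustration only.

# Assumed per-band transmission of each color filter: a red filter
# passes red light and largely blocks green and blue, and so on.
FILTERS = {
    "red":   {"red": 0.9,  "green": 0.05, "blue": 0.05},
    "green": {"red": 0.05, "green": 0.9,  "blue": 0.05},
    "blue":  {"red": 0.05, "green": 0.05, "blue": 0.9},
}

QUANTUM_EFFICIENCY = 0.5  # assumed fraction of photons freeing an electron


def stored_charge(filter_color, photons_by_band):
    """Electrons accumulated by one filtered photodiode."""
    transmission = FILTERS[filter_color]
    absorbed = sum(transmission[band] * count
                   for band, count in photons_by_band.items())
    return absorbed * QUANTUM_EFFICIENCY


# White-ish light: equal photon counts in each band. A red-filtered
# pixel accumulates charge mostly from the red photons.
light = {"red": 1000, "green": 1000, "blue": 1000}
print(stored_charge("red", light))
```

In this sketch the stored charge is what the CMOS support circuits would then read out for each pixel; a real sensor's filter transmission and quantum efficiency are device-specific.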
Unfortunately, as pixel areas have shrunk to improve density, fabricating dense CMOS image sensor arrays has become more challenging. CMOS has not been particularly well suited to efficient pixel design because dense chip/array wiring formed above the array tends to block or diffuse light to the underlying pixels. CMOS device structures also overlay and tend to obstruct the photo-sensor diodes (photodiodes). So, polysilicon gates and array/chip wiring tend to reduce the amount of light energy reaching the photodiodes. Also, the device structures and wiring limit the incident angle at which light can be collected. This is exacerbated by shrinking cell size, which is necessary for higher pixel density. Shrinking the cell requires even smaller photodiodes that are more densely packed in the pixel array.
Finally, filters in color filter arrays (CFAs) are often physically displaced from the pixel imaging surface. This displacement causes light to diffract. Consequently, the image can smear due to light bleeding in from adjacent pixels.
Thus, there is a need for denser, simpler image sensors that are easier to produce and, more particularly, for denser, simpler, easier-to-produce CMOS pixel arrays.