This relates generally to imaging sensors, and more particularly, to imaging sensors with pixels that include more than one photosensitive region.
Modern electronic devices such as cellular telephones, cameras, and computers often use digital image sensors. Imagers (i.e., image sensors) include a two-dimensional array of image sensing pixels. Each pixel includes a photosensor such as a photodiode that receives incident photons (light) and converts the photons into electrical charges. An image sensing pixel in the two-dimensional array of image sensing pixels includes a single photosensitive region and a color filter formed over the photosensitive region.
When viewed as a whole, the array of color filters associated with the array of image sensing pixels in the image sensor is referred to as a color filter array. Ideally, photosensitive regions associated with a pixel having a red color filter would only be exposed to light that has passed through a red color filter, photosensitive regions associated with a pixel having a green color filter would only be exposed to light that has passed through a green color filter, and photosensitive regions associated with a pixel having a blue color filter would only be exposed to light that has passed through a blue color filter.
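The arrangement of red, green, and blue filters described above can be illustrated with a minimal sketch. The RGGB Bayer mosaic used here is a common color filter array layout chosen for illustration; the text does not specify a particular pattern.

```python
# Illustrative sketch of a color filter array as a tiled 2x2 mosaic.
# The RGGB Bayer pattern below is an assumed example layout.
def cfa_color(row, col):
    """Return the color filter ('R', 'G', or 'B') over the pixel at
    (row, col) in an RGGB Bayer color filter array."""
    pattern = [['R', 'G'],
               ['G', 'B']]
    return pattern[row % 2][col % 2]

# Print a 4x4 patch of the array: each pixel's photosensitive region
# ideally receives only light passed through its own filter color.
for r in range(4):
    print(' '.join(cfa_color(r, c) for c in range(4)))
```

Running the sketch prints the repeating mosaic, showing that every pixel in the array is covered by exactly one filter color.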
However, there is often undesired optical cross-talk between adjacent pixels associated with different colors (i.e., having color filters of different colors). Optical cross-talk occurs when light that has passed through one pixel's color filter reaches the photosensitive region of a neighboring pixel. Optical cross-talk can degrade the output image quality of an imager.
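The effect of cross-talk can be sketched as a simple signal-mixing model, in which a small fraction of each pixel's light leaks into its immediate neighbors. The one-dimensional pixel row and the 5% leak fraction are hypothetical values chosen for illustration, not figures from the text.

```python
# Hedged sketch: model optical cross-talk along a row of pixels, where
# each pixel leaks a fraction `leak` of its light into each adjacent
# pixel. The leak fraction and signal values are assumed examples.
def apply_crosstalk(signals, leak=0.05):
    """Return per-pixel signals after cross-talk mixing.

    Each pixel loses `leak` of its own light to every existing
    neighbor and gains `leak` of each neighbor's light, so the total
    light across the row is conserved.
    """
    n = len(signals)
    out = []
    for i in range(n):
        neighbors = [j for j in (i - 1, i + 1) if 0 <= j < n]
        kept = signals[i] - leak * len(neighbors) * signals[i]
        leaked_in = sum(leak * signals[j] for j in neighbors)
        out.append(kept + leaked_in)
    return out

# A dark pixel between two bright neighbors of a different filter
# color picks up light intended for those neighbors.
print(apply_crosstalk([100.0, 0.0, 100.0]))
```

In this sketch the middle pixel, which should read zero, registers a nonzero signal contributed entirely by its neighbors, which is the kind of color contamination that degrades output image quality.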
It would therefore be desirable to be able to provide improved image pixels for imaging devices.