Sensor arrays are used in a wide variety of optical devices. A sensor array contains an array of pixels which are typically responsive to visible light and, to a lesser degree, to infrared light. Thus, a sensor array will produce a monochromatic signal representative of all colors of visible light, with each pixel producing a signal indicative of light incident on that pixel.
Any color, within limits, may be represented by a linear combination of three additive primary colors, such as red, green, and blue. To enable a sensor array to sense color, a color filter array having red, green, and blue filter elements is overlaid on the sensor array so that each filter element of the color filter array is aligned with one pixel of the sensor array.
FIG. 1 shows a color sensor array including a color filter array 10 having red elements 12, green elements 14, and blue elements 16 arranged in what is known as a Bayer pattern, which is named after its inventor and is disclosed in U.S. Pat. No. 3,971,065. The Bayer pattern is a repeating pattern of a 2×2 array of color filter elements including one red element 12, two green elements 14, and one blue element 16. Thus, there are twice as many green elements 14 as there are red elements 12 or blue elements 16, which takes into account the fact that the human eye is more sensitive to green light than it is to red or blue light. The color filter array 10 is overlaid on a sensor array 18. Color sensor arrays embodying the basic structure shown in FIG. 1 and variations thereof are manufactured by many companies.
The red elements 12 block green light and blue light and allow only red light to reach the corresponding pixels of the sensor array 18, which therefore output only red color components. The green elements 14 block red light and blue light and allow only green light to reach the corresponding pixels of sensor array 18, which therefore output only green color components. The blue elements 16 block red light and green light and allow only blue light to reach the corresponding pixels of the sensor array 18, which therefore output only blue color components. Thus, two-thirds of the light incident on the color filter array 10 is blocked from reaching the sensor array 18, significantly reducing the overall detection sensitivity of the sensor array 18 for a color image.
If there are n×m pixels in the sensor array 18 (for example, 480×640 pixels), the resolution of the sensor array 18 for a monochromatic image is n×m (for example, 480×640 pixels). However, one 2×2 array of color filter elements including one red element 12, two green elements 14, and one blue element 16 in the color filter array 10 is required to detect one pixel of a color image. Thus, the resolution of the combination of the color filter array 10 and the sensor array 18 for a color image is n/2×m/2 (for example, 480/2×640/2=240×320 pixels), thereby significantly reducing the resolution of the sensor array 18 for a color image.
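The resolution and sensitivity penalties above can be made concrete with a short calculation (a sketch; the 480×640 array size is the example used in the text, and the one-third figure follows from two of the three primaries being blocked at each filter element):

```python
# Resolution of an n x m sensor array, monochrome vs. Bayer color
# (480 x 640 is the example array size used above).
n, m = 480, 640

mono = (n, m)             # every pixel yields a sample
color = (n // 2, m // 2)  # one 2x2 Bayer cell yields one color pixel

print(mono)   # (480, 640)
print(color)  # (240, 320)

# Each filter element passes only one of the three primaries, so roughly
# two-thirds of the incident light never reaches the sensor array.
transmitted_fraction = 1 / 3
```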
Furthermore, since each pixel in the sensor array 18 outputs only one of the color components red, green, and blue, it is necessary to interpolate the other two color components for that pixel using a demosaicing algorithm. Many such algorithms are known in the art. However, all of these algorithms introduce demosaicing artifacts into the interpolated color image, which degrade the quality of the image.
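A minimal sketch can make the interpolation step concrete. The bilinear scheme below is one of the simplest demosaicing approaches and is shown only for illustration; the RGGB layout and the helper names are assumptions, and this is not any specific algorithm from the art:

```python
import numpy as np

def conv3(a, k):
    """3x3 weighted neighborhood sum with zero padding (k is symmetric)."""
    p = np.pad(a, 1)
    out = np.zeros(a.shape)
    for dy in range(3):
        for dx in range(3):
            out += k[dy, dx] * p[dy:dy + a.shape[0], dx:dx + a.shape[1]]
    return out

def bilinear_demosaic(raw):
    """Estimate the two missing color components at every pixel of an
    assumed RGGB Bayer mosaic by averaging the nearest samples of each
    color (normalized convolution)."""
    h, w = raw.shape
    y, x = np.mgrid[0:h, 0:w]
    masks = [(y % 2 == 0) & (x % 2 == 0),  # red sample positions
             (y + x) % 2 == 1,             # green sample positions
             (y % 2 == 1) & (x % 2 == 1)]  # blue sample positions
    k = np.array([[0.25, 0.5, 0.25],
                  [0.5,  1.0, 0.5],
                  [0.25, 0.5, 0.25]])
    out = np.empty((h, w, 3))
    for c, m in enumerate(masks):
        num = conv3(np.where(m, raw, 0.0), k)
        den = conv3(m.astype(float), k)
        out[..., c] = np.where(m, raw, num / den)  # keep real samples as-is
    return out
```

On a flat input the reconstruction is exact; on real images this simple averaging is exactly what introduces the demosaicing artifacts mentioned above, such as color fringing at sharp edges.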
FIG. 2 shows a portion of a color sensor array 20 disclosed in U.S. Pat. No. 5,965,875 in which red, green, and blue pixels are stacked vertically in an attempt to solve the problems of reduced sensitivity and reduced resolution for a color image which are exhibited by the color sensor array shown in FIG. 1. Color sensor arrays embodying the basic structure shown in FIG. 2 are manufactured by Foveon, Inc., of Santa Clara, Calif., and include the Foveon F7X3-C9110 and FO18-50-F19 X3 direct image sensors.
The color sensor array 20 is based on the fact that light incident on the surface of a silicon substrate penetrates into the silicon substrate, where it is absorbed over a characteristic absorption depth that depends on, and increases with, the wavelength of the light. Thus, blue light is absorbed over a first characteristic absorption depth, green light is absorbed over a second characteristic absorption depth deeper than the first characteristic absorption depth, and red light is absorbed over a third characteristic absorption depth deeper than the second characteristic absorption depth.
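The wavelength dependence can be sketched with the Beer-Lambert law, I(z) = I0·exp(−z/d), where d is the characteristic absorption depth. The depths below are rough textbook values for crystalline silicon, assumed here for illustration and not taken from the text:

```python
import math

# Rough textbook absorption depths in silicon (illustrative assumptions).
DEPTH_UM = {"blue (450 nm)": 0.4,
            "green (550 nm)": 1.5,
            "red (650 nm)": 3.3}

def fraction_absorbed(layer_um, d_um):
    """Fraction of incident light absorbed in the top layer_um of silicon,
    per Beer-Lambert: 1 - exp(-z/d)."""
    return 1.0 - math.exp(-layer_um / d_um)

# Depth ordering matches the text: blue < green < red.
# Crosstalk preview: even in a shallow layer sized for blue light,
# some red light is absorbed.
blue_layer = DEPTH_UM["blue (450 nm)"]
red_in_blue = fraction_absorbed(blue_layer, DEPTH_UM["red (650 nm)"])
print(f"~{red_in_blue:.0%} of red light is absorbed in the blue layer")
```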
The color sensor array 20 includes a P-type silicon substrate 22. An N-type doped well region 24 is formed in the P-type silicon substrate 22 and forms a pn junction 26 with the P-type silicon substrate 22 at the characteristic absorption depth of red light. Thus, the pn junction 26 acts as a red-sensitive photodiode and detects red light incident on the color sensor array 20.
A P-type doped well region 28 is formed in the N-type doped well region 24 and forms a pn junction 30 with the N-type doped well region 24 at the characteristic absorption depth of green light. Thus, the pn junction 30 acts as a green-sensitive photodiode and detects green light incident on the color sensor array 20.
An N-type doped well region 32 is formed in the P-type doped well region 28 and forms a pn junction 34 with the P-type doped well region 28 at the characteristic absorption depth of blue light. Thus, the pn junction 34 acts as a blue-sensitive photodiode and detects blue light incident on the color sensor array 20.
A blue current detector conceptually shown in FIG. 2 as a blue current meter 36 is connected across the pn junction 34 acting as a blue-sensitive photodiode to detect a blue current IB. A green current detector conceptually shown in FIG. 2 as a green current meter 38 is connected across the pn junction 30 acting as a green-sensitive photodiode to detect a green current IG. A red current detector conceptually shown in FIG. 2 as a red current meter 40 is connected across the pn junction 26 acting as a red-sensitive photodiode to detect a red current IR.
The color sensor array 20 may be considered to be composed of a blue pixel stacked on a green pixel stacked on a red pixel. However, this stacked structure makes the color sensor array 20 susceptible to crosstalk between the pixels.
For example, the green and red light pass through the blue pixel on the way to the green and red pixels. Although most of the green and red light will pass through the blue pixel without being absorbed because the blue pixel is at a shallower depth than the characteristic absorption depths of the green and red light, a certain portion of the green and red light will nevertheless be absorbed in the blue pixel. Thus, the blue current IB produced by the blue pixel is not an accurate representation of the blue light incident on the color sensor array 20 because a portion of the blue current IB was generated by the green and red light incident on the color sensor array 20.
Likewise, the red light passes through the green pixel on the way to the red pixel. Although most of the red light will pass through the green pixel without being absorbed because the green pixel is at a shallower depth than the characteristic absorption depth of the red light, a certain portion of the red light will nevertheless be absorbed in the green pixel. Thus, the green current IG produced by the green pixel is not an accurate representation of the green light incident on the color sensor array 20 because a portion of the green current IG was generated by the red light incident on the color sensor array 20, and a portion of the green light was absorbed in the blue pixel before it could reach the green pixel.
Finally, the red current IR produced by the red pixel is not an accurate representation of the red light incident on the color sensor array 20 because a portion of the red light was absorbed in the blue and green pixels before it could reach the red pixel.
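Because each of these errors is (to first order) proportional to the incident intensities, the crosstalk can be modeled as a linear mixing of the true color components. The sketch below assumes a hypothetical 3×3 mixing matrix with illustrative values only; a real sensor's matrix would have to be characterized by measurement:

```python
import numpy as np

# Hypothetical crosstalk model: each measured current is a linear mix of
# the true color intensities. Matrix entries are illustrative only.
#              true:   R     G     B
M = np.array([[0.05, 0.10, 0.90],   # IB: blue, plus absorbed G and R
              [0.10, 0.80, 0.00],   # IG: attenuated green, plus absorbed R
              [0.75, 0.00, 0.00]])  # IR: red attenuated by the layers above

true_rgb = np.array([1.0, 0.5, 0.2])  # true incident R, G, B intensities
measured = M @ true_rgb               # currents IB, IG, IR the meters read

# If M can be characterized, the true intensities are recoverable by
# inverting the mixing (at the cost of amplified noise):
recovered = np.linalg.solve(M, measured)
print(np.allclose(recovered, true_rgb))  # True
```

Such correction trades crosstalk error for noise amplification, which is one reason reducing the crosstalk itself remains desirable.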
Furthermore, the sizes of the blue, green, and red pixels are different, with the blue pixel being smaller than the green pixel and the green pixel being smaller than the red pixel. Thus, a portion of the blue light incident on the color sensor array 20 falls outside the blue pixel, and a portion of the green light incident on the color sensor array 20 falls outside the green pixel, which reduces the sensitivity of the color sensor array 20 to blue and green light and makes the blue current IB and the green current IG smaller than they should be, thereby introducing additional errors into the blue current IB and the green current IG.
Also, although the overall size of the vertically stacked blue, green, and red pixels shown in FIG. 2 is smaller than the overall size of the 2×2 array of one red element 12, two green elements 14, and one blue element 16 in the color filter array 10 shown in FIG. 1, it is larger than one pixel of the sensor array 18 shown in FIG. 1. Thus, although the resolution of the vertically stacked blue, green, and red pixels shown in FIG. 2 is greater than the resolution of the color sensor array shown in FIG. 1 for a color image, it is lower than the resolution of the sensor array 18 shown in FIG. 1 for a monochromatic image.
Accordingly, it would be desirable to have a color sensor array with vertically stacked blue, green, and red pixels of equal size, no larger than one pixel of a monochromatic sensor array, and with reduced crosstalk between the pixels.