1. Field of the Invention
The present invention relates generally to CMOS image sensors, and more particularly to a sensor, system and method for utilizing pixels with long exposure times and pixels with short exposure times on the same sensor to provide a sensor with a wide dynamic range.
2. Description of the Related Art
In digital photography, rarely are the lighting conditions ideal. The scene to be photographed may be too dark, too light, too diverse, or too dynamic, such as when a cloud quickly blocks the sun. In order to adjust to different lighting conditions, an image sensor needs to have a wide dynamic range. That is, the sensor should be able to adjust to the current lighting conditions to maximize the details in the image, and thereby produce the best possible picture.
Many scenes contain useful information in both shadow and bright areas, which are traditionally very difficult to image at the same time. A recurring prior art solution is to take multiple images of the same scene using different exposure times. For example, as discussed in A. El Gamal, High-Dynamic-Range Image Sensors, tutorial presented at IEEE Int. Solid State Circuits Conf., February 2002, a sensor could take three pictures—one with a short exposure time, one with a medium exposure time, and one with a long exposure time. Another example is disclosed in U.S. Pat. No. 5,264,940 by Komiya, which also describes the benefit of accumulating an image more than once using different exposures. Certain features in the scene which may be overexposed in the long exposure frame are visible in the short exposure image. Similarly, scene features which are too dark, or not visible at all, in the short exposure image are visible as a result of the longer exposure time. The images may be combined using image processing techniques to produce a processed image that contains more visual detail than any single image alone. However, a drawback to such an approach is that the processed images often contain motion artifacts caused by movement in the scene between the different exposures. Another major drawback is that the different exposures are taken at different times and can thus represent wholly different images. U.S. Pat. No. 4,584,606 discloses the use of multiple sensors to simultaneously accumulate the different exposures at generally the same moment in time.
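The multiple-exposure fusion described above can be illustrated with a minimal sketch. The function below is a hypothetical illustration, not taken from the cited references; it assumes a linear sensor response and fuses frames by normalizing each to a common radiance scale (dividing by exposure time) and averaging only unsaturated samples:

```python
import numpy as np

def fuse_exposures(frames, exposure_times, full_scale=255.0, sat_level=0.95):
    """Fuse frames taken with different exposure times into one
    wide-dynamic-range radiance estimate (linear response assumed)."""
    acc = np.zeros(frames[0].shape, dtype=np.float64)
    wsum = np.zeros_like(acc)
    for frame, t in zip(frames, exposure_times):
        f = frame.astype(np.float64)
        # Pixels at or near saturation carry no radiance information.
        valid = f < sat_level * full_scale
        # Dividing by exposure time maps each frame to a common scale.
        acc += np.where(valid, f / t, 0.0)
        wsum += valid
    # Average the valid samples per pixel (guard against divide-by-zero).
    return acc / np.maximum(wsum, 1)
```

A bright scene feature that clips in the long exposure is still recovered from the short exposure, and a dim feature lost in the short exposure is recovered from the long one; the motion-artifact drawback noted above remains, since the frames are captured at different times.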
Another solution is described in Nayar, S. K. and Mitsunaga, T., "High dynamic range imaging: spatially varying pixel exposures," IEEE Conference on Computer Vision and Pattern Recognition, Vol. 1, pp. 472-479, June 2000, ISBN: 0-7695-0662-3. According to this approach, an array of neutral density filters with different opacities is deposited on a sensor so that pixels with darker filters sample brighter portions of a scene, while pixels with lighter filters capture lower light features (see FIG. 1). The information can be combined using low pass filtering or techniques such as cubic interpolation. The problem with such a solution is that the dynamic range of the sensor is effectively fixed by the selected filters and filter pattern, regardless of whether the selected filter pattern also includes a color structure, such as a Bayer filter matrix, or is monochromatic with a structure having alternating optical transmittance.
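A minimal sketch of the spatially-varying-exposure reconstruction can clarify the approach. The function below is a hypothetical illustration (the filter transmittances, saturation threshold, and neighbor-averaging scheme are assumptions, and the cited work uses more sophisticated interpolation): each pixel is normalized by its filter's transmittance to a common radiance scale, and saturated light-filter pixels are filled in from their unsaturated neighbors:

```python
import numpy as np

def reconstruct_sve(raw, mask_bright, t_bright=1.0, t_dark=0.25,
                    full_scale=255.0, sat_level=0.95):
    """Reconstruct a spatially-varying-exposure frame: pixels under the
    lighter filter saturate sooner; fill them from radiance-normalized
    darker-filter neighbors by simple 4-neighbor averaging."""
    raw = raw.astype(np.float64)
    trans = np.where(mask_bright, t_bright, t_dark)
    # Normalize by filter transmittance -> common radiance scale.
    est = raw / trans
    saturated = raw >= sat_level * full_scale
    out = est.copy()
    H, W = raw.shape
    # Replace each saturated pixel with the mean of its valid neighbors.
    for y, x in zip(*np.nonzero(saturated)):
        nbrs = [est[yy, xx]
                for yy, xx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                if 0 <= yy < H and 0 <= xx < W and not saturated[yy, xx]]
        if nbrs:
            out[y, x] = sum(nbrs) / len(nbrs)
    return out
```

The fixed-pattern limitation noted above is visible in this sketch: the recoverable dynamic range is set once by `t_bright` and `t_dark`, which are physical filter properties and cannot adapt to the scene.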