Electronic devices, such as cellular telephones, cameras, and computers, commonly use image sensors to capture images by sensing light. A typical image sensor includes a focal plane array of pixels, and each pixel includes a photosensor, such as a photogate, photoconductor, or photodiode, for accumulating photo-generated charge in a portion of the substrate. When photons impinge on the photosensor, electron-hole pairs are generated. Conventional image sensors convert the electrons that are integrated (collected) in the pixels into a voltage, while the holes are generally discarded into the substrate.
Typical CMOS image sensors have an intra-scene capture range of about 40-60 dB, which is less than the intra-scene capture range of the human eye. A solution to this problem is high dynamic range (HDR) imaging, which extends the scene luminance capture range to about 60-120 dB.
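The dB figures above follow from the standard definition of dynamic range as the ratio of the largest to the smallest resolvable signal, expressed in decibels. The sketch below illustrates that relationship; the full-well and noise-floor values are hypothetical, chosen only to reproduce the 60 dB and 120 dB endpoints mentioned above.

```python
import math

def dynamic_range_db(max_signal: float, min_signal: float) -> float:
    """Dynamic range in decibels: 20 * log10(max_signal / min_signal)."""
    return 20.0 * math.log10(max_signal / min_signal)

# Hypothetical conventional sensor: a full-well capacity of 10,000 e-
# over a 10 e- noise floor yields a 1,000:1 ratio, i.e. 60 dB.
conventional = dynamic_range_db(10_000, 10)

# Reaching the upper HDR figure of 120 dB requires a
# 1,000,000:1 ratio between the brightest and darkest signals.
hdr = dynamic_range_db(1_000_000, 1)
```

As the example shows, each additional 20 dB of capture range corresponds to another factor of ten in the max-to-min signal ratio, which is why extending a sensor from 60 dB to 120 dB is a substantial engineering challenge.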
Current efforts to provide HDR scene luminance capture, including intra-frame multi-exposure and interlaced-exposure techniques, have met with varying degrees of success. For example, several methods being used to provide HDR suffer from high memory requirements resulting from the need to store additional exposures, high power requirements, loss of resolution, and motion artifacts.