Color digital imaging systems, such as digital cameras, typically employ a single image sensor, such as a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) device, to digitally capture a scene of interest. Image sensors typically include an array of optical detectors, such as photodiodes, that generate an electrical response in proportion to the intensity of incident light. The dynamic range of an individual optical detector is bounded at the low end by the minimum amount of light required to generate an electrical response and at the high end by the maximum amount of light beyond which the electrical response of the optical detector does not change (i.e., a saturation point).
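The dynamic range described above can be expressed as the ratio between the saturation point and the minimum detectable signal, often quoted in decibels. The following sketch uses purely illustrative detector figures (not values from this text) to show the calculation:

```python
import math

# Hypothetical detector figures (illustrative only): the smallest
# light level that produces a measurable response, and the level
# beyond which the response no longer changes (saturation).
min_detectable = 0.001   # low end of the detector's range
saturation = 1.0         # high end (saturation point)

ratio = saturation / min_detectable        # 1000:1 dynamic range
dynamic_range_db = 20 * math.log10(ratio)  # expressed in dB
```

With a 1000:1 ratio, this works out to 60 dB, a figure in the range commonly cited for consumer CCD and CMOS sensors.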
The dynamic range of an image sensor is an important characteristic when capturing high contrast images. When bright and/or dark areas of a scene exceed the dynamic range of an image sensor, the quality of the captured image may be degraded. If the sensitivity of the image sensor is adjusted, for example by decreasing the exposure time so that the features of the bright areas are adequately captured, then the features of the dark areas are captured poorly, and vice versa.
One technique for capturing high contrast images with a digital sensor involves capturing two images of the same scene in rapid succession, with the sensitivity of the image sensor set to capture the bright areas in a first image and the dark areas in a second image. The two images may then be used to create a composite image that includes the features of both the bright and dark areas.
Although the two-image technique may extend the effective dynamic range of an image sensor, changes in the scene between the capture of the first and second images may introduce motion artifacts that degrade the quality of the composite image.