An image sensor is a device that can convert an optical image into an electronic signal. Image sensors are oftentimes utilized in still cameras, video cameras, video systems, and other imaging devices. Cameras and other imaging devices commonly employ either a charge-coupled device (CCD) image sensor or a complementary metal-oxide-semiconductor (CMOS) image sensor.
CMOS image sensors include an array of pixels, each of which can comprise a photodetector. CMOS image sensors also include circuitry to convert light energy to an analog voltage. Moreover, CMOS image sensors can include additional circuitry to convert the analog voltage to digital data. Thus, a CMOS image sensor can be an integrated circuit that comprises various analog, digital, and mixed-signal components associated with capturing light and processing imaging related information; accordingly, a CMOS image sensor can be a system on chip (SoC). For example, components integrated into the CMOS image sensor oftentimes include a processor module (e.g., microprocessor, microcontroller, or digital signal processor (DSP) core), memory, analog interfaces (e.g., analog to digital converters, digital to analog converters), and so forth.
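The analog-to-digital step described above can be illustrated with a minimal sketch. The function name, reference voltage, and linear quantization model below are illustrative assumptions, not details from the document; a real sensor ADC is considerably more involved.

```python
# Hypothetical sketch of the analog-to-digital conversion stage in a
# CMOS image sensor readout, assuming a simple linear n-bit ADC model.

def adc_convert(voltage, v_ref=1.0, bits=10):
    """Quantize an analog pixel voltage into an n-bit digital code."""
    # Clamp the input to the ADC range [0, v_ref].
    voltage = max(0.0, min(voltage, v_ref))
    max_code = (1 << bits) - 1  # e.g., 1023 for a 10-bit converter
    return round(voltage / v_ref * max_code)

# A full-scale input maps to the maximum code; a dark pixel maps to zero.
print(adc_convert(1.0))  # 1023
print(adc_convert(0.0))  # 0
```

In a system-on-chip sensor, this conversion is performed on-die, which is one reason such designs need few external supporting components.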
Visible imaging systems implemented using CMOS image sensors can reduce costs, power consumption, and noise while improving resolution. For instance, cameras can use CMOS image sensors that efficiently marry low-noise image detection and signal processing with multiple supporting blocks that can provide timing control, clock drivers, reference voltages, analog to digital conversion, digital to analog conversion, key signal processing elements, and the like. High-performance video cameras can thereby be assembled using a single CMOS integrated circuit supported by a few components, such as a lens and a battery, for instance. Accordingly, by leveraging CMOS image sensors, camera size can be decreased and battery life can be increased. Also, dual-use cameras have emerged that can employ CMOS image sensors to alternately produce high-resolution still images or high definition (HD) video.
Image sensors oftentimes have defect pixels, which can appear as undesirable outliers when a final image is formed. Defect pixels can have a variety of root causes, such as high dark current, faulty transistors, or the like. Moreover, the number of defect pixels may change from image to image as a function of conditions in which a camera is operated, such as scene, camera temperature, amount of light, and so forth. For instance, an image obtained by an image sensor may have 1000 defect pixels, while a different image obtained by the same image sensor may have 100 defect pixels.
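One common way such outliers are located, not prescribed by the document but consistent with the high-dark-current root cause it mentions, is to examine a dark frame and flag pixels that deviate strongly from the rest. The sketch below is a hypothetical illustration; the function name, threshold, and use of the median absolute deviation are all assumptions.

```python
# Illustrative sketch: flag likely defect pixels in a dark frame,
# where high dark current shows up as bright outliers.

def find_defect_pixels(dark_frame, threshold=5.0):
    """Return (row, col) positions whose value deviates from the frame
    median by more than `threshold` times a robust spread estimate."""
    values = sorted(v for row in dark_frame for v in row)
    median = values[len(values) // 2]
    # Median absolute deviation: robust against the outliers themselves.
    devs = sorted(abs(v - median) for v in values)
    mad = devs[len(devs) // 2] or 1.0
    return [(r, c)
            for r, row in enumerate(dark_frame)
            for c, v in enumerate(row)
            if abs(v - median) > threshold * mad]

# A single hot pixel in an otherwise uniform dark frame is flagged.
frame = [[10, 11, 10, 9],
         [10, 200, 11, 10],
         [9, 10, 10, 11],
         [10, 9, 11, 10]]
print(find_defect_pixels(frame))  # [(1, 1)]
```

Because defect counts vary with temperature and exposure conditions, such a list is not fixed; a detection pass of this kind would need to be repeated as operating conditions change.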
Conventional image processing pipelines oftentimes include some type of static pixel correction or dynamic pixel correction. However, performance varies widely from algorithm to algorithm. For instance, typical static pixel correction approaches are oftentimes inflexible, and thereby unable to mitigate defects that occur intermittently. Moreover, conventional dynamic pixel correction techniques commonly attempt to detect defective pixels as an image is processed, and replace pixels detected to be defective with fair approximations based on a remainder of the image.
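A dynamic correction step of the kind described, detecting a defective pixel during processing and replacing it with an approximation drawn from the surrounding image, can be sketched as follows. The neighborhood size, threshold, and function name are illustrative assumptions rather than a technique the document specifies.

```python
# Hedged sketch of dynamic defect pixel correction: a pixel that
# deviates strongly from its 3x3 neighborhood is replaced with the
# neighborhood median, approximating it from the remainder of the image.

def correct_pixel(image, r, c, threshold=50):
    """Return a corrected value for image[r][c] if it looks defective,
    else the original value."""
    neighbors = [image[rr][cc]
                 for rr in range(max(0, r - 1), min(len(image), r + 2))
                 for cc in range(max(0, c - 1), min(len(image[0]), c + 2))
                 if (rr, cc) != (r, c)]
    neighbors.sort()
    median = neighbors[len(neighbors) // 2]
    # Replace only when the pixel is an outlier relative to its neighbors.
    return median if abs(image[r][c] - median) > threshold else image[r][c]

image = [[10, 12, 11],
         [13, 255, 12],
         [11, 10, 12]]
print(correct_pixel(image, 1, 1))  # 12 (stuck pixel replaced)
print(correct_pixel(image, 0, 0))  # 10 (normal pixel unchanged)
```

Because this decision is made per image at processing time, such a dynamic approach can handle intermittent defects that a static, precomputed defect list would miss, at the cost of occasionally misclassifying legitimate fine detail as a defect.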