A microbolometer is a type of uncooled sensor for detecting infrared (IR) radiation. Conventional microbolometers include a focal plane array (FPA) of detector elements, or pixels, each of which measures a change in electrical resistance when exposed to thermal radiation. The change in resistance is converted into an image representing the temperature corresponding to the wavelength of the radiation. Unlike images produced from light in the visible spectrum, infrared images have low contrast and must undergo significant processing (including, e.g., amplification) to produce an image that is visually pleasing to the human eye. Furthermore, the materials currently used to produce uncooled IR detectors are less mature than those used for detecting visible light, and inherent non-uniformities in pixel responsivity and offset often exist within conventional focal plane arrays. When an IR image is processed, these non-uniformities may be amplified as well, producing undesirable artifacts. Non-uniformity correction (NUC) algorithms are typically applied by an image preprocessor to remove the variations caused by these non-uniformities, as well as to compensate for the temperature of the microbolometer itself, improving the resulting image.
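As an illustration only (the document does not specify a particular NUC algorithm), the per-pixel gain and offset correction described above can be sketched as a classic two-point NUC: two calibration frames taken while viewing uniform scenes at two temperatures yield a gain and offset for each pixel, which are then applied to every subsequent raw frame. The function names and the simple linear pixel model below are assumptions made for the sketch, not part of the source.

```python
import numpy as np

def two_point_nuc(cold, hot):
    """Derive per-pixel gain/offset from two uniform-scene calibration frames.

    After correction, every pixel maps the cold and hot scenes onto the
    array-wide mean response, removing responsivity and offset variation.
    """
    gain = (hot.mean() - cold.mean()) / (hot - cold)
    offset = cold.mean() - gain * cold
    return gain, offset

def apply_nuc(raw, gain, offset):
    """Apply the per-pixel correction to a raw frame."""
    return gain * raw + offset

# Demonstration with a synthetic 4x4 FPA whose pixels have random
# responsivity (gain) and fixed-pattern offset -- a simplistic linear
# pixel model assumed purely for illustration.
rng = np.random.default_rng(0)
resp = 1.0 + 0.2 * rng.standard_normal((4, 4))  # per-pixel responsivity
fixed = 5.0 * rng.standard_normal((4, 4))       # per-pixel fixed offset

def scene(level):
    # Raw detector output when viewing a uniform scene of the given level
    return resp * level + fixed

gain, offset = two_point_nuc(scene(100.0), scene(200.0))
corrected = apply_nuc(scene(150.0), gain, offset)
```

Under this linear model, the corrected frame of a uniform scene is (up to floating-point error) perfectly flat, since each pixel's individual response has been mapped onto the array-wide mean.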
Conventional uncooled microbolometers typically use Field Programmable Gate Arrays (FPGAs) to perform at least some of the image preprocessing, and the FPGA may be incorporated into the FPA. Due to their complexity, NUC and contrast enhancement algorithms require significantly more processing power than, for example, a conventional video encoding algorithm. Once an FPGA design is finalized, the principal way to reduce power consumption is the costly process of converting the FPGA design into an application-specific integrated circuit (ASIC).