Imaging through random media such as the atmosphere or a volume of water results in images that are deteriorated for two distinct reasons: (1) optical wave propagation through a turbulent medium induces wavefront aberrations that distort images randomly and prevent imaging systems from achieving diffraction-limited performance, and (2) imaging conditions such as low light level, haze, dust, and aerosol pollution typically produce images that are noisy, with reduced contrast and visibility. The combination of both deteriorating factors severely limits the performance of imaging systems operating in such conditions.
Over time, a number of techniques have been used to compensate for turbulence-induced aberrations. Among electro-mechanical solutions to the problem, the most significant is conventional adaptive optics (AO) [U.S. Pat. Nos. 5,046,824; 5,026,977; 5,684,545], a technique originally developed for astronomical observations. Conventional AO successfully achieves near-diffraction-limited imaging but suffers from anisoplanatism, which restricts the correctable field-of-view (FOV) to small angular extents. Though multiple-guide-star AO and multi-conjugate AO systems [U.S. Pat. No. 6,452,146] have attempted to extend the FOV, its angular extent is still typically limited to values on the order of one-tenth of a degree.
Based on a different approach, a number of digital processing techniques have been developed; they have demonstrated image quality improvements under weak anisoplanatism (narrow FOV) but generally fail otherwise. Techniques based on block processing (or mosaic processing) can reconstruct images over anisoplanatic FOVs but usually require knowledge of the point spread function (PSF), which is unavailable in most applications. Another approach, referred to as “lucky” imaging, consists of selecting the best-quality frames from a stream of short-exposure images using an image quality metric. The problem with that approach is the low probability that a good-quality frame appears under anisoplanatic conditions.
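The frame-selection step of “lucky” imaging described above can be sketched as follows. The quality metric shown (mean squared intensity gradient) and the `keep_fraction` parameter are illustrative assumptions; the text does not specify a particular metric:

```python
import numpy as np

def sharpness_metric(frame):
    """Global image-quality metric: mean squared intensity gradient.
    (One common choice; the specific metric is an assumption here.)"""
    gy, gx = np.gradient(frame.astype(float))
    return float(np.mean(gx**2 + gy**2))

def lucky_imaging(frames, keep_fraction=0.1):
    """Select the best-quality short-exposure frames and average them."""
    scores = [sharpness_metric(f) for f in frames]
    n_keep = max(1, int(len(frames) * keep_fraction))
    best = np.argsort(scores)[-n_keep:]   # indices of the highest-scoring frames
    return np.mean([frames[i] for i in best], axis=0)
```

Under anisoplanatic conditions the weakness noted above appears directly in this scheme: no single frame in the stream is likely to score well over the whole FOV at once.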
Techniques referred to as synthetic imaging or lucky-region fusion (LRF) have been developed that overcome most shortfalls of the techniques mentioned above and compensate for turbulence-induced distortions while succeeding under anisoplanatic conditions. In fact, the LRF method has essentially no limit on its effective FOV and performs successfully over angular extents hundreds of times larger than the isoplanatic angle. The technique consists of fusing the best-quality regions within a stream of short-exposure images based on their local image quality. It owes its robustness under anisoplanatic conditions to the use of a tool that characterizes the quality of an image locally: an image quality map (IQM).
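A minimal sketch of region-wise fusion driven by an image quality map follows. It assumes a gradient-energy IQM averaged over a local window and a per-pixel “take the sharpest frame” fusion rule; the actual IQM and fusion rule of the LRF method are not specified in this passage:

```python
import numpy as np

def _box_filter(a, size):
    """Separable local-mean (box) filter, pure NumPy."""
    k = np.ones(size) / size
    a = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 0, a)
    return np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, a)

def image_quality_map(frame, size=7):
    """Local sharpness map (IQM): gradient energy averaged over a window."""
    gy, gx = np.gradient(frame.astype(float))
    return _box_filter(gx**2 + gy**2, size)

def lucky_region_fusion(frames, size=7):
    """Fuse a stream of short-exposure frames region by region.
    At each pixel, keep the value from the frame whose IQM is highest there."""
    stack = np.stack([f.astype(float) for f in frames])
    iqms = np.stack([image_quality_map(f, size) for f in frames])
    best = np.argmax(iqms, axis=0)            # per-pixel index of sharpest frame
    return np.take_along_axis(stack, best[None, ...], axis=0)[0]
```

Because the selection is made independently at each location, the fused image can draw its content from many different frames across the FOV, which is why this family of methods is not bound by the isoplanatic angle.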
Though a number of other fusion techniques exist, they do not aim to mitigate random image distortions [U.S. Pat. Nos. 4,661,986; 5,140,416; 5,325,449; 5,881,163; 6,201,899; 6,320,979; 6,898,331; 7,176,963]. Additionally, they typically require either two or more image sensors or special hardware such as moving lenses or a moving sensor. By contrast, the LRF technique successfully mitigates random image distortions and has the advantage of requiring only one image sensor to collect a stream of randomly-distorted images.
The downfall of most image processing techniques is that they operate directly on the raw data stream collected by the image sensor(s); their performance therefore depends strongly on imaging conditions such as light level, aerosol pollution, dust, haze, and other deteriorating factors.
The present invention includes a step, applied prior to the LRF algorithm, specifically designed to enhance those aspects of image quality in the raw data stream that are most critical to a successful fusion. In particular, it mitigates the effects of low light level, dust, haze, aerosol pollution, and other deteriorating factors.
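As one illustration of where such a pre-enhancement step sits in the pipeline, the sketch below applies a simple percentile-based contrast stretch to each raw frame before fusion. The stretch and its `low_pct`/`high_pct` parameters are stand-in assumptions, not the specific enhancement method of the invention:

```python
import numpy as np

def enhance_frame(frame, low_pct=1.0, high_pct=99.0):
    """Illustrative pre-enhancement: percentile-based contrast stretch.
    (A stand-in for the invention's enhancement step, which targets low
    light, haze, etc.; the actual method is not specified here.)"""
    f = frame.astype(float)
    lo, hi = np.percentile(f, [low_pct, high_pct])
    return np.clip((f - lo) / max(hi - lo, 1e-12), 0.0, 1.0)

def enhance_stream(frames):
    """Enhance each raw frame before handing the stream to LRF fusion."""
    return [enhance_frame(f) for f in frames]
```

The point of ordering the steps this way is that the local quality assessment and fusion then operate on frames whose contrast and visibility have already been restored, rather than on degraded raw data.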