1. Field of the Invention
This invention relates to imaging systems, and more particularly to relaxing limitations on scene motion across the imager to address issues such as imaging system pointing and stabilization limitations, non-uniformity compensation (NUC), dead pixel clumps, burn-in mitigation, field-of-view (FOV) extension, super-sampling, pixel-phase induced noise in frame summing, and staying within the limits of optical stabilization actuators.
2. Description of the Related Art
Imaging systems image a field-of-view (FOV) of a scene along a line-of-sight (LOS) onto an image detector that captures images at a certain frame rate. The “scene” is what the imaging system is looking at, such as an object of interest in a background. It is well known and widely accepted that any motion of the scene across the image detector produces smearing of the scene in the image during the frame. Such motion and smearing are tightly coupled: the higher the rate of motion, the greater the amount of smearing. The tradeoff of rate of motion vs. smearing constitutes a system-level trade space.
Motion of the scene across the image detector has two components. A first motion component represents motion that the system would like to remove during frame integration. This component may include intentional motion of the imaging system across the scene (e.g. LOS scan to cover a larger area across the frame), unintentional motion of the image detector across the scene (e.g. platform jitter) and scene motion (e.g. the object of interest is moving relative to the background). A second motion component represents intentional scene motion across the image detector (as opposed to across the scene) that must not be removed. This component may be induced for such purposes as enabling pixel non-uniformity compensation, mitigating the effects of dead pixel clumps in the imager or the effects of burn-in such as vidicon ghosting, or for enabling effective FOV enlargement across a sequence of frames (as in step-stare, TDI, or non-TDI scanners).
The imaging system may implement optical stabilization to remove the first motion component. The system measures the unintentional and intentional motion of the detector across the scene and subtracts the motion from the estimated scene motion to produce an actuation signal. The actuation signal drives an actuator to control LOS to cancel the first motion component. The actuator may reposition the image detector, one or more optical components of the system's optical focusing system or a gimbal on which the imaging system is mounted. Because the actuator has a limited dynamic range, it may need to be continuously re-centered to stay within actuation limits. Re-centering produces intentional motion of the scene across the detector.
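The subtraction and actuator-limit behavior described above can be sketched in pseudocode form. The following Python fragment is purely illustrative; the function names, signatures, and pixel-displacement units are assumptions, not part of the invention:

```python
def actuation_signal(measured_motion, intentional_motion):
    """Command that cancels only the unwanted (first) motion component.

    measured_motion:    total measured scene motion across the detector
    intentional_motion: commanded motion that must be preserved
    Both are assumed to be (x, y) displacements in detector pixels.
    """
    return (measured_motion[0] - intentional_motion[0],
            measured_motion[1] - intentional_motion[1])

def clamp_to_actuator_range(command, limit):
    """Actuators have a finite throw; when a command saturates against
    the limit, the system must re-center, which itself produces
    intentional scene motion across the detector."""
    return tuple(max(-limit, min(limit, c)) for c in command)
```

In this sketch, only the difference between the measured and intentional motion is fed to the actuator, so the second motion component survives stabilization by construction.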
Ideally, optical stabilization removes all unintentional sources of scene motion across the image detector. But it does not, in fact must not, remove the second motion component representing intentional scene motion across the imager from frame-to-frame. Therefore, in existing practice, the system designer must always balance the inherent trade-off of smearing vs. rate of intentional motion across the image detector.
In certain imaging systems the frame rate is increased to relax the motion vs. smear tradeoff, and the consequent signal-to-noise ratio (SNR) loss is mitigated by registering and summing multiple frames, producing a sum-image having a higher SNR than the individual frames. For example, when the frame rate is increased by 5× and sum-frames are produced at the previous frame rate, the single-frame smear is reduced by about a factor of five. Frame summing is sometimes also used to recover SNR lost to other system limitations, such as short single-frame integration time, but it remains limited by registration accuracy and smear.
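The SNR benefit of summing registered frames follows from the fact that a constant signal grows linearly with the number of frames while independent noise grows only as its square root. A minimal Monte-Carlo sketch, with assumed function names and a simple Gaussian noise model chosen purely for illustration:

```python
import random
import statistics

def summed_snr(signal, noise_sigma, n_frames, trials=2000, seed=0):
    """Estimate the SNR of a sum of n_frames perfectly registered frames,
    each carrying a constant signal plus independent Gaussian noise.

    Summing N frames multiplies the signal by N but the noise standard
    deviation only by sqrt(N), so SNR improves roughly as sqrt(N).
    """
    rng = random.Random(seed)
    sums = [sum(signal + rng.gauss(0.0, noise_sigma) for _ in range(n_frames))
            for _ in range(trials)]
    return statistics.mean(sums) / statistics.stdev(sums)
```

For example, summing 25 frames should yield roughly a 5× SNR improvement over a single frame, consistent with the sqrt(N) rule; this idealization assumes perfect registration and ignores the sub-pixel phase effects discussed below.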
Frame summing also introduces another system sensitivity. The same motion that produces smear causes shifts in the sub-pixel phase of the image from frame-to-frame. Since the individual images are inherently pixelized, this produces an effective misregistration of the images to be summed, which causes the summing to increase the effective smearing. It also increases effective noise because the smearing function (set by the sub-pixel phase history) does not repeat from sum-frame to sum-frame. In some cases, this blurring offsets much of the SNR benefit of summing multiple images together.
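This pixel-phase blurring can be demonstrated with a toy one-dimensional model: a sharp step edge sampled at different sub-pixel phases produces different pixelized profiles, and averaging them widens the edge even though every frame is registered to the nearest whole pixel. The following sketch uses assumed names and an idealized box-integrating pixel model:

```python
def pixelize_edge(phase, n_pixels=8):
    """Sample a unit step edge located at x = n_pixels/2 + phase,
    box-integrated over each pixel (ideal pixelization).
    Each output value is the fraction of that pixel past the edge."""
    edge = n_pixels / 2 + phase
    return [min(1.0, max(0.0, (p + 1) - edge)) for p in range(n_pixels)]

def summed_edge(phases):
    """Average pixelized frames taken at the given sub-pixel phases,
    simulating whole-pixel registration that ignores sub-pixel shifts."""
    frames = [pixelize_edge(ph) for ph in phases]
    n = len(frames[0])
    return [sum(f[i] for f in frames) / len(frames) for i in range(n)]

def transition_width(profile):
    """Number of pixels whose value is strictly between 0 and 1,
    i.e. the width of the blurred edge transition."""
    return sum(1 for v in profile if 0.0 < v < 1.0)
```

At a single phase the edge is perfectly sharp (transition width zero), while averaging frames taken at phases spread over a pixel produces partial-value pixels on either side of the edge: the misregistration has been converted into blur.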
The “register and sum” function may be performed off of the image detector, in a computer or in dedicated logic, or may be performed on the image detector (e.g. an orthogonal transfer charge coupled device (CCD) or a time delay and integration (TDI) CCD). The advantage of performing the register and sum in the detector is a much higher frame rate (per summed frame), limiting the time over which smear occurs, and consequently the smear. But this cannot fully eliminate the smear from intentional and necessary motion.
Conventional optical stabilization cannot fix the remaining smear either, because as stated previously this would require canceling the intentional motion. For example, scanning TDI uses registered “frames” of a multi-column detector to improve performance over a single-column detector, but relies on image motion across the detector to scan the scene. Complete stabilization would cancel this motion. Thus, the designer is always left with at least one frame of smear embedded in the summed frame.