1. Field
The present disclosure relates generally to an improved data processing system and in particular to a method and apparatus for processing image data. Still more particularly, the present disclosure relates to a method, apparatus, and computer usable program code for focusing an image using data generated by a synthetic aperture radar.
2. Background
Synthetic aperture radar (SAR) is a form of radar that generates a beam which may be used for applications such as, for example, remote sensing and mapping. In one application, a radar antenna may be attached to an aircraft. A single pulse or signal may be radiated from an antenna that has different dimensions in the horizontal and vertical directions to achieve a desired beam width. Oftentimes, the signal radiated from the antenna may illuminate the terrain below the aircraft outwards towards the horizon. The amplitude and phase of the signal returning from a given location on the ground may be recorded.
Further, the aircraft may transmit a series of pulses as the aircraft travels. The results from these pulses may be combined. In other words, a series of different observations or phases may be combined through different processes to generate an image. This image may be, for example, a map of reflectivity including both amplitude and phase. In these examples, phase information is typically discarded. The amplitude information may contain information about ground cover in the same manner as a black and white picture.
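The coherent combination of pulses described above may be illustrated with a brief sketch. The following fragment is illustrative only and is not part of the present disclosure; the function name `azimuth_compress` and all parameter values are hypothetical. It shows per-pulse complex returns being summed after removing an assumed azimuth phase history, which is the essence of forming one focused azimuth sample:

```python
import numpy as np

# Illustrative sketch only, not part of the present disclosure.
# A point target observed over a series of pulses produces complex
# returns whose phase follows a known (here, quadratic) history.
# Removing that history and summing coherently focuses the sample.

def azimuth_compress(pulse_returns, expected_phase):
    """Coherently sum per-pulse complex returns after removing the
    expected azimuth phase history (a slow-time matched filter)."""
    return np.sum(pulse_returns * np.exp(-1j * expected_phase))

n = 128                                    # number of pulses (hypothetical)
t = np.linspace(-1.0, 1.0, n)              # slow time across the aperture
phase = 40.0 * t**2                        # assumed quadratic phase history
returns = np.exp(1j * phase)               # unit-amplitude complex returns

focused = azimuth_compress(returns, phase)          # phases cancel
unfocused = azimuth_compress(returns, np.zeros(n))  # no compensation
```

When the assumed phase history matches the actual one, the returns add coherently and the magnitude approaches the number of pulses; any residual phase error reduces this coherent gain.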
Synthetic aperture radar images may be smeared in the cross range direction due to residual phase error. This smearing is also referred to as defocusing. The cross range direction is also referred to as the azimuth direction. The azimuth direction is perpendicular to the range direction from the synthetic aperture radar system to the target. This residual phase error may be caused by incomplete motion compensation, atmospheric effects, or some other source.
Autofocus is a function that may be used to produce focused images by estimating and correcting the residual phase error. Phase error may be introduced by navigation errors or other sources that may degrade the quality of the image. In these examples, autofocus may be used during image formation for estimation and compensation of the residual phase error that cannot be completely removed through motion compensation based on navigation data.
Various autofocus algorithms and techniques are currently available. These techniques, however, may be limited in performance or applicable situations. For example, map drift and phase difference methods are examples of parametric approaches that may be used to perform autofocus for processing radar data to create images. These types of methods measure the relative shifts of maps from sub-aperture data blocks and use these measurements to estimate non-linear phase error based on piecewise linear approximations.
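The map-drift idea described above may be sketched as follows. This fragment is an illustrative simplification, not the disclosed method: a quadratic phase error appears as two different linear phase ramps over the two halves of the aperture, so the two sub-aperture maps shift apart in azimuth. Here the shift is measured with a power-weighted spectral centroid, a simplification of correlating the two maps, and all parameter values are hypothetical:

```python
import numpy as np

# Illustrative sketch only, not the disclosed method.  A quadratic
# phase error a*t^2 has slope 2*a*t, so the two aperture halves carry
# different average linear ramps and their maps drift apart in azimuth.

n = 256
t = np.linspace(-1.0, 1.0, n, endpoint=False)    # slow time
a_true = 20.0                                    # quadratic coefficient
signal = np.exp(1j * a_true * t**2)              # point target with error

half1, half2 = signal[:n // 2], signal[n // 2:]  # two sub-aperture blocks
freqs = np.fft.fftfreq(n // 2, d=t[1] - t[0])    # azimuth frequency bins

def map_position(block):
    """Power-weighted mean azimuth frequency of a sub-aperture map."""
    power = np.abs(np.fft.fft(block)) ** 2
    return np.sum(freqs * power) / np.sum(power)

# The halves are centered at t = -0.5 and t = +0.5, so their maps sit
# near frequencies -a/(2*pi) and +a/(2*pi); the drift is a/pi, hence:
drift = map_position(half2) - map_position(half1)
a_est = np.pi * drift
```

A practical map-drift implementation correlates the magnitude maps of the sub-apertures rather than taking centroids, and divides the aperture into more than two blocks to reach higher-order error, which shrinks each block and reduces its signal-to-noise ratio.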
Other methods include non-parametric methods, such as, for example, the phase gradient algorithm and multiple discrete autofocus. These methods are simple, efficient, and capable of estimating high-order phase error. The performance of these types of techniques, however, may be affected by the contents of the scene from which the image is made. Other examples of currently available techniques include successive parameter adjustments, which uses a one-dimensional search algorithm to update model parameters in the direction that improves an image sharpness metric.
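The successive-parameter-adjustment idea may be sketched with a one-dimensional search over a single model parameter. This fragment is illustrative only; the quadratic-only error model, the fourth-power sharpness metric, and all values are assumptions made for the sketch:

```python
import numpy as np

# Illustrative sketch only: search one model parameter (a quadratic
# phase coefficient) for the value that maximizes an image sharpness
# metric, here the sum of squared intensities.

n = 128
t = np.linspace(-1.0, 1.0, n, endpoint=False)
a_true = 15.0
data = np.exp(1j * a_true * t**2)          # defocused point target

def sharpness(a):
    """Sharpness of the azimuth image after removing a*t^2."""
    intensity = np.abs(np.fft.fft(data * np.exp(-1j * a * t**2))) ** 2
    return np.sum(intensity ** 2)          # concentrated energy scores high

# One-dimensional search: evaluate the metric over candidate values and
# keep the best (a practical method updates parameters iteratively).
candidates = np.linspace(0.0, 30.0, 301)
a_est = candidates[np.argmax([sharpness(a) for a in candidates])]
```

Because the corrected image of a point target concentrates all energy into one azimuth bin exactly when the candidate matches the true coefficient, the metric peaks at the correct value.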
Map drift and phase difference methods are less sensitive to scene contents. These types of methods, however, are limited in the order of phase error that may be estimated. For higher-order phase error, the full collection array is divided into smaller data blocks, and this division reduces the signal-to-noise ratio of each block. As a result, the maximum order that can be estimated is practically limited to around five with this type of technique.
The phase gradient algorithm and multiple discrete autofocus calculate the phase error profile from isolated point targets. With these techniques, however, isolated targets cannot be perfectly distinguished from targets closely located in the azimuth direction, and using non-isolated targets may result in estimation error.
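The phase-gradient idea may be sketched as follows, in an illustrative simplification that is not the disclosed method: each range bin is assumed to hold an already-centered, isolated strong scatterer, the pulse-to-pulse phase difference is averaged across range bins, and integrating the result estimates the common phase error. The windowing and target-selection steps of a full phase gradient algorithm are omitted, and all parameter values are hypothetical:

```python
import numpy as np

# Illustrative sketch only, not the disclosed method.  The windowing
# and center-shifting steps of a full phase gradient algorithm are
# omitted: each range bin is assumed to hold a centered point target.

rng = np.random.default_rng(0)
n = 128                                   # pulses (azimuth samples)
bins = 32                                 # range bins
k = np.arange(n)
phase_err = 8.0 * np.sin(2.0 * np.pi * k / n)   # unknown common error

targets = np.ones((bins, n), dtype=complex)     # centered scatterers
noise = 0.1 * (rng.standard_normal((bins, n))
               + 1j * rng.standard_normal((bins, n)))
data = (targets + noise) * np.exp(1j * phase_err)

# Phase gradient estimate: average the pulse-to-pulse phase difference
# over range bins, then integrate it into a phase error profile.
diff = np.angle(np.sum(data[:, 1:] * np.conj(data[:, :-1]), axis=0))
est = np.concatenate(([0.0], np.cumsum(diff)))

corrected = data * np.exp(-1j * est)

# Peak of the azimuth image before and after correction.
img_before = np.abs(np.fft.fft(data, axis=1)).max()
img_after = np.abs(np.fft.fft(corrected, axis=1)).max()
```

Because the estimate relies on the dominant scatterer in each range bin being isolated, returns from closely spaced targets bias the averaged phase difference, producing the estimation error discussed above.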
Another situation in which these techniques have difficulty is when the phase error varies spatially. In this situation, the phase error must be estimated locally over small image blocks, and the chances of having good isolated targets in a small image block are low. This type of situation may result in performance degradation. Further, inaccurate local phase error estimation also poses a problem in achieving the phase continuity that is desired between the divided small image blocks.
Therefore, it would be advantageous to have a method, apparatus, and computer program product that overcome the problems described above.