The invention relates to image processing and, more specifically, to an improved method and apparatus for use in scene matching navigation systems.
Image matching plays an important role in the navigation of both manned and unmanned aircraft (hereinafter airframe). Autonomous image (scene) matching navigation systems have been developed that use the optical features of the terrain to provide accurate navigation position updates and, consequently, accurate guidance.
Generally, terrain (scene) images (frames) obtained (sensed) in-flight by an on-board video camera are matched (correlated) with a previously obtained and stored reference map whose position is precisely known. Once the match point between the sensed image or frame (hereinafter sensed image and frame are used interchangeably) and the map is found, the match point can be combined with sensor perspective and altitude, as well as map position, to update the airframe's inertial navigator. (See E. H. Conrow and J. A. Ratkovic, "Almost Everything One Needs to Know about Image Matching Systems," Image Processing for Missile Guidance, Proc. Soc. Photo-Opt. Instru. Eng. 238, pp. 426-453 (1980).) One image matching system (correlator) used in current missiles is described in J. R. Carr and J. S. Sobek, "Digital Scene Matching Area Correlator," Image Processing for Missile Guidance, Proc. Soc. Photo-Opt. Instru. Eng. 238, pp. 36-41 (1980).
As shown in FIG. 1, long before the airframe is in flight, images of scenes are obtained by reconnaissance and transmitted to a processing center. Then, during mission planning, the guidance uncertainties in both range (downtrack direction) and crossrange (guidance uncertainty plus an allowance for airframe body motion) are determined by computer simulations. Reference maps are then prepared for each scene from the previously obtained images and are sized to the mission requirements, but are always larger than the sensed image to be obtained by the airframe camera while in-flight. However, to minimize memory storage and the probability of false update, a map is only made large enough to ensure that enough sensed images will be acquired entirely within the reference map to ensure update reliability.
The airframe guidance system controls the operating period of the image matching system by the use of discrete commands issued over a serial data bus. These commands cause the image matching system to sense and correlate (match) images throughout the entirety of the range of navigational position uncertainty.
Referring again to FIG. 1, during flight, upon receipt of the appropriate command, a camera (e.g., RS-170 video camera) on board the airframe senses the terrain below and acquires a time sequence of sensed images dependent upon the airframe motion, angular field of view (FOV), altitude, perspective, and the sensed image acquisition rate. The sensed images are either acquired with or corrected to the reference map's scale and heading; their relative positions are related to the velocity, altitude and attitude of the airframe, which quantities are measured by the airframe guidance system. Finally, the sensed images are processed to enhance features and compress the data to manageable amounts.
During processing of each original video image, a low resolution, single-bit, two-dimensional match template is computed. This process may include linearly correcting sensor image brightness errors that arise from vignetting and nonuniform response in the sensor. It will include: forming lower resolution pixels (called cells) by low-pass filtering and decimating the high resolution image, resulting in a field of view (FOV) of K×L cells whose size is limited by maximum dimensions that maximize the correlation Peak-to-Sidelobe Ratio (PSR), a measure of correlator performance (this limit on FOV size results from the fact that geometry errors cause the correlation peak to degrade faster than the standard deviation of the sidelobes as the FOV size increases); filtering to enhance the salient features of the cell-sized image and to remove the image mean; and thresholding the filter output at zero for each cell to yield a zero (cell filtered gray level less than zero) or a one (cell filtered gray level greater than or equal to zero).
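The template-forming steps above can be sketched in code. The following is a minimal illustrative sketch, not the actual correlator implementation: the cell size, the use of simple block averaging as the low-pass/decimation step, and the use of mean removal as a stand-in for the feature-enhancement filter are all assumptions made for clarity.

```python
import numpy as np

def make_match_template(image, cell=8):
    """Reduce a high-resolution sensed image to a single-bit match template.

    Steps (per the text): low-pass filter and decimate into cells, filter
    to remove the image mean (a simplified stand-in for the salient-feature
    filter), and threshold the result at zero.
    """
    h, w = image.shape
    k, l = h // cell, w // cell
    # Low-pass filter and decimate: average each cell x cell block of pixels.
    cells = image[:k * cell, :l * cell].reshape(k, cell, l, cell).mean(axis=(1, 3))
    # Remove the image mean so the filtered output is zero-mean.
    filtered = cells - cells.mean()
    # Threshold at zero: >= 0 yields a one, < 0 yields a zero.
    return (filtered >= 0).astype(np.uint8)
```

The result is a K×L array of single-bit cells suitable for binary correlation against a similarly processed reference map.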
The processed sensed images are then correlated with the reference map as shown in FIG. 2. Each sensed image is compared with every possible location that is entirely within the reference map and a correlation match level (amplitude) (hereinafter level and amplitude are used interchangeably) is computed for each location. The correlation levels for each sensed image together form a correlation surface (see FIG. 3 on the left; to the right is a corresponding histogram of correlation levels). Any correlation level in the correlation surface that exceeds a preset threshold is considered significant and recorded in memory along with the x,y location of the event and the sensed image number. Sensed images are numbered one (1) to N as the airframe flies through the uncertainty area that encompasses the reference map. For each sensed image correlating above a planned threshold, the x,y position of the last highest correlation is identified. These x,y positions from a sequence of sensed images are compared for consistency with the airframe motion, the motion being known by the airframe's inertial guidance unit.
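The exhaustive comparison described above can be sketched as follows. This is a hypothetical illustration: the match level used here (the fraction of agreeing cells between the single-bit template and each map segment) and the function names are assumptions, not the system's actual metric.

```python
import numpy as np

def correlation_surface(template, ref_map):
    """Compute a correlation level for every placement of the template
    that lies entirely within the reference map (both single-bit arrays)."""
    th, tw = template.shape
    mh, mw = ref_map.shape
    surface = np.zeros((mh - th + 1, mw - tw + 1))
    for y in range(surface.shape[0]):
        for x in range(surface.shape[1]):
            patch = ref_map[y:y + th, x:x + tw]
            # Illustrative match level: fraction of cells that agree.
            surface[y, x] = np.mean(patch == template)
    return surface

def significant_matches(surface, threshold):
    """Record every (y, x, level) whose level exceeds a preset threshold."""
    ys, xs = np.where(surface > threshold)
    return [(int(y), int(x), float(surface[y, x])) for y, x in zip(ys, xs)]
```

A perfect placement yields a level of 1.0; placements elsewhere cluster near 0.5 for uncorrelated binary data, which is why a threshold well above that background is applied.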
A valid correlation between a sensed image and a segment of the reference map usually yields a cluster of significant correlations around the correlation peak. Conversely, an invalid sensed image, resulting from its being taken outside the map area, usually produces no significant correlations. Once determined, the last best correlation peak can be used to identify the match point between the reference map and the sensed image and, if the threshold and position-consistency criteria are met, to provide a navigation update to the airframe guidance system.
Image matching system update-position detectors apply a fixed correlation-peak threshold, precomputed during mission planning, to correlations. As previously noted, correlation levels for each sensed image exceeding the threshold are stored in memory, along with the position of each level. Then, the stored levels are sorted to find the greatest. Finally, two of the three update positions from a sequence of three sensed images must be consistent with the motion and attitude change of the airframe before an update is sent to the guidance system.
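The consistency test above can be sketched in code. This is an illustrative reading of the two-of-three criterion, under the assumption that "two of three positions consistent" means at least one measured displacement between a pair of best-match positions agrees, within a tolerance, with the displacement predicted by the inertial guidance unit; the function name, tolerance, and units (cells) are assumptions.

```python
def consistent_update(positions, expected_motion, tol=1.0):
    """Check whether two of three best-match positions are consistent
    with the inertially measured airframe motion.

    positions: (x, y) best-correlation position for three consecutive
        sensed images.
    expected_motion: (dx, dy) predicted displacement between consecutive
        sensed images, from the inertial guidance unit.
    """
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            # Predicted displacement scales with the image spacing j - i.
            dx_pred = expected_motion[0] * (j - i)
            dy_pred = expected_motion[1] * (j - i)
            dx_meas = positions[j][0] - positions[i][0]
            dy_meas = positions[j][1] - positions[i][1]
            if (abs(dx_meas - dx_pred) <= tol
                    and abs(dy_meas - dy_pred) <= tol):
                return True  # two positions agree with the motion
    return False
```

Only when this check passes would the corresponding match point be forwarded to the guidance system as a navigation update.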
Although image matching systems have been implemented and are in use, problems remain. While such systems are intended to operate in many environmental conditions, it is difficult to find scenes that provide navigation updates for all times of day, seasons, and weather. In fact, the fashion in which scenes change, as seen by a particular sensor over time, is the major problem image matching systems face. These scene changes are generically known as instabilities, examples of which are:
a. shadows caused by diurnal variations in lighting that cause anti-correlations or no match at all,
b. seasonal variations caused by flora cycles that may also cause anti-correlations or no match at all,
c. changes in cultural features (e.g., a housing development) caused by construction or demolition, and
d. precipitation, where snow and standing water can cause features on the ground to be obscured or radically changed.
In addition to problems caused by scenes changing, the following scenes are currently unsuitable for correlation because of the nature of their fixed character:
a. low contrast scenes, caused by scene reflectance or irradiance, that cause the sensed image contrast to be low and thus sensor noise to dominate the sensed image,
b. scenes with poor feature content, meaning that the observable features are too large or small and, hence, there is insufficient match information at practical resolution sizes, and
c. inhomogeneous scenes, where some regions in the scene are suitable for correlation but other regions are not.
Image matching systems need signal processing improvements that provide reliable navigation updates over degraded scenes by reducing or eliminating the effect of scene instabilities and other causes of poor correlation.