Streak Tube Imaging Lidar (STIL) is used for the imaging and detection of targets in a turbid medium such as water or air. To image an underwater region, STIL systems use a pulsed fan beam to illuminate a thin strip on the ocean bottom that corresponds to a single (cross-track) line in the rendered imagery. Photon returns from the ocean bottom and the adjacent water column are captured in a charge-coupled device (CCD) array, essentially capturing a thin volumetric slice, or shot, of the ocean in terms of volumetric pixels, or voxels. Each pixel can be represented by a temporal profile which contains an approximately Gaussian-shaped curve of the bottom return.
The three-dimensional nature of STIL data yields large data files which require relatively large amounts of time to process and/or transmit. Accordingly, the three-dimensional STIL data is often rendered into two-dimensional forms thereof. In general, STIL data can be rendered into two-dimensional contrast and range maps by finding the peak value of the temporal profile of each pixel. The magnitude of the peak value corresponds to the contrast and the location of the peak value corresponds to the range.
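The peak-based rendering described above can be sketched as follows. The cube dimensions and random data are illustrative stand-ins for real STIL returns, not values from any actual system.

```python
import numpy as np

# Hypothetical 3-D STIL data cube: axes are (along-track shot,
# cross-track pixel, time bin). Random data stands in for real returns.
rng = np.random.default_rng(0)
cube = rng.random((4, 5, 64))

# Peak-based rendering: for each pixel's temporal profile, the peak
# magnitude gives the contrast and the peak's time-bin location gives
# the range, collapsing the 3-D cube into two 2-D maps.
contrast_map = cube.max(axis=2)    # 2-D contrast map
range_map = cube.argmax(axis=2)    # 2-D range map (in time-bin units)
```

Each 2-D map has one entry per pixel, so the full temporal axis need not be stored or transmitted once rendering is complete.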
Current methods of rendering involve searching the temporal profile for a matched Gaussian curve fit. However, this technique is limited in that a prior estimate of the Gaussian curve width must be known; problems arise when the true width of the Gaussian curve in the temporal profile differs from the fitted width. In addition, prior to the matched Gaussian curve fit, the dark field and ambient light field must be determined and removed from the temporal profile. These rendering methods use a separate dark field file to remove the dark field values and must estimate a Ksys value to compute the ambient light field's exponential decay curve. This means that a separate dark field file must be provided, and a poor estimate of the Ksys value yields poor rendering results.
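A minimal sketch of such a matched Gaussian fit is shown below, assuming the dark field and ambient light field have already been removed. The function name, the template extent, and the use of sliding correlation as the matching step are illustrative assumptions, not details taken from the methods described above.

```python
import numpy as np

def matched_gaussian_peak(profile, sigma):
    """Locate the bottom return by matching a fixed-width Gaussian template.

    `sigma` is the prior width estimate that must be supplied in advance;
    when the true pulse width differs from it, the fit degrades. Assumes
    the dark field and ambient light field were removed beforehand.
    """
    # Fixed-width Gaussian template (odd length, centered at zero).
    offsets = np.arange(-3 * sigma, 3 * sigma + 1)
    template = np.exp(-0.5 * (offsets / sigma) ** 2)
    # Slide the template across the temporal profile; the best match
    # locates the bottom return (range) and its amplitude (contrast).
    scores = np.correlate(profile, template, mode="same")
    peak = int(np.argmax(scores))
    return peak, profile[peak]
```

Because `sigma` is fixed before the fit, the template cannot adapt to a pulse whose true width differs, which is the limitation noted above.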
Another method of rendering is disclosed in U.S. patent application Ser. No. 10/429,330 entitled “RENDERING THREE-DIMENSIONAL STREAK TUBE IMAGING LIDAR (STIL) DATA TO TWO-DIMENSIONAL CONTRAST AND RANGE MAPPINGS THEREOF”, filed Apr. 28, 2003. This method involves processing a temporal portion of the STIL data for each pixel of the three-dimensional image. Each temporal portion includes a Gaussian-like portion and non-Gaussian-like portions. Processing for each pixel includes: (i) removing noise from the temporal portion using the non-Gaussian-like portions, resulting in a filtered form of the Gaussian-like portion, (ii) determining a non-integer center of mass of the filtered form of the Gaussian-like portion, with the non-integer center of mass being indicative of a range value, and (iii) applying a mathematical interpolation function to determine the amplitude of the filtered form of the Gaussian-like portion at the range value defined by the non-integer center of mass. The amplitude so determined is indicative of a contrast value. The process is repeated for all pixels to produce range and contrast maps.
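Steps (ii) and (iii) above can be sketched for a single pixel as follows. This assumes step (i) has already produced the filtered Gaussian-like portion; linear interpolation stands in for whatever mathematical interpolation function the method actually uses, and the function name is illustrative.

```python
import numpy as np

def com_render(profile):
    """Sketch of center-of-mass rendering for one pixel's filtered profile.

    Assumes noise was already removed, leaving the filtered Gaussian-like
    portion. Returns (range value, contrast value) for this pixel.
    """
    bins = np.arange(profile.size, dtype=float)
    # (ii) Non-integer center of mass -> range value.
    com = float(np.sum(bins * profile) / np.sum(profile))
    # (iii) Interpolated amplitude at the non-integer range -> contrast
    # value (linear interpolation used here as a stand-in).
    contrast = float(np.interp(com, bins, profile))
    return com, contrast
```

Unlike a peak search, the center of mass yields a sub-bin (non-integer) range, which is why interpolation is needed to read off the amplitude there.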
In general, rendered two-dimensional contrast and range data is subject to the following three effects:
(1) CCD array effects cause forward-direction bands to form in both the contrast and range images;
(2) jitter in the STIL system's laser causes cross-track banding in both the contrast and range images; and
(3) intensity roll-off occurs in the cross-track direction (multiplicative for contrast and additive for range) due to uneven photon path propagation caused by the STIL system's wide field-of-view.
The current method for correcting CCD array effects and contrast intensity roll-off involves normalizing the contrast and range maps with the image “column profile” (i.e., an array of average intensity values taken along the columns of pixel data). The contrast map is normalized by dividing the contrast image columns by the (contrast) column profile to correct multiplicative effects associated with contrast data. The range map is normalized by subtracting the (range) column profile from the range image columns to correct additive effects associated with range data. This corrects both the contrast intensity roll-off and the CCD array banding simultaneously. However, the disadvantage of this technique is that substantial shadowing effects occur about objects within the image scene in cluttered environments. This detracts from the enhancement of the image scene and can cause significant problems with automatic target recognition algorithms.
The current method for correcting jitter effects involves normalizing the image rows with the image “row profile” (i.e., an array of average intensity values taken along the rows of pixel data). The contrast map is normalized by dividing the contrast image rows by the contrast row profile (to correct multiplicative effects), and the range map is normalized by subtracting the range row profile from the range image rows (to correct additive effects). Again, this leads to substantial shadowing effects about objects within the image scene in cluttered environments which, in turn, cause problems with automatic target recognition algorithms.
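The two normalization steps above (column-profile and row-profile) can be sketched together as follows. The array names and random maps are illustrative stand-ins for rendered contrast and range data.

```python
import numpy as np

rng = np.random.default_rng(1)
contrast = rng.random((6, 8)) + 0.5   # contrast map (multiplicative effects)
range_map = rng.random((6, 8))        # range map (additive effects)

# Column profile: average intensity along the columns of pixel data.
# Divide it out of the contrast map; subtract it from the range map.
contrast_c = contrast / contrast.mean(axis=0)
range_c = range_map - range_map.mean(axis=0)

# Row profile: average intensity along the rows of pixel data,
# applied the same way to correct laser-jitter banding.
contrast_cr = contrast_c / contrast_c.mean(axis=1, keepdims=True)
range_cr = range_c - range_c.mean(axis=1, keepdims=True)
```

Note that the profiles average over object pixels as well as background, so a bright or dark object biases the profile of its entire row or column; flattening with that biased profile is the source of the shadowing artifacts about objects described above.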
The current method of enhancement also has significant contrast problems as the image scene's background illumination changes and as the objects within the scene vary from high contrast to low contrast. The result is either the saturation of high-contrast objects so that low-contrast objects remain visible, or the preservation of high-contrast objects at the expense of obscuring low-contrast objects.