The present invention relates to the field of three-dimensional image capture and in particular to the capture of a color texture image in conjunction with a scannerless range imaging system.
Standard image capture systems capture images, such as photographic images, that are two-dimensional representations of the three-dimensional world. In such systems, projective geometry best models the process of transforming the three-dimensional real world into two-dimensional images. In particular, much of the information that is lost in the transformation is the distance between the camera and image points in the real world. Methods and processes have been proposed to retrieve or record this information. Some methods, such as one based on a scanner from Cyberware, Inc., use a laser to scan across a scene; variations in the reflected light are used to estimate the range to the object. However, these methods require the subject to be close (e.g., within 2 meters) to the camera and are typically slow. Stereo imaging is a common example of another process, which is fast on capture but requires solving the "correspondence problem", that is, the problem of finding corresponding points in the two images. This can be difficult, and it limits the number of pixels having range data, because only a limited number of feature points are suitable for the correspondence processing.
Another method described in U.S. Pat. No. 4,935,616 (and further described in the Sandia Lab News, vol. 46, No. 19, Sep. 16, 1994) provides a scannerless range imaging system using either an amplitude-modulated high-power laser diode or an array of amplitude-modulated light emitting diodes (LEDs) to completely illuminate a target scene. Conventional optics confine the target beam and image the target onto a receiver, which includes an integrating detector array sensor having hundreds of elements in each dimension. The range to a target is determined by measuring the phase shift of the reflected light from the target relative to the amplitude-modulated carrier phase of the transmitted light. To make this measurement, the gain of an image intensifier (in particular, a micro-channel plate) within the receiver is modulated at the same frequency as the transmitter, so the amount of light reaching the sensor (a charge-coupled device) is a function of the range-dependent phase difference. A second image is then taken without receiver or transmitter modulation and is used to eliminate non-range-carrying intensity information. Both captured images are registered spatially, and a digital processor is used to operate on these two frames to extract range. Consequently, the range associated with each pixel is essentially measured simultaneously across the whole scene.
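The phase-to-range relationship underlying such a system can be sketched as follows. This is an illustrative calculation only, not code from the '616 patent; the function name and the 10 MHz example frequency are assumptions chosen for clarity:

```python
# Hypothetical sketch: converting a measured phase shift of the
# amplitude-modulation envelope into a range estimate.
import math

C = 299_792_458.0  # speed of light, m/s

def phase_to_range(phase_rad: float, mod_freq_hz: float) -> float:
    """Range implied by the round-trip phase delay of the modulation.

    The reflected light travels to the object and back, so a full
    2*pi of phase delay corresponds to half the modulation wavelength.
    """
    wavelength = C / mod_freq_hz  # modulation wavelength, in meters
    return (phase_rad / (2 * math.pi)) * wavelength / 2.0

# Example: a pi/2 phase delay at 10 MHz modulation (30 m wavelength)
# corresponds to roughly 3.75 m of range.
r = phase_to_range(math.pi / 2, 10e6)
```

Note that the range is unambiguous only within half a modulation wavelength, which is one reason the choice of modulation frequency matters in such systems.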
The preferred method of estimating the range in the '616 patent uses a pair of captured images, one image with a destructive interference caused by modulating the image intensifier, and the other with the image intensifier set at a constant voltage. However, a more stable estimation method uses a series of at least three images, each with modulation applied to the image intensifier, as described in commonly assigned copending application Ser. No. 09/342,370, entitled "Method and Apparatus for Scannerless Range Image Capture Using Photographic Film" and filed Jun. 29, 1999 in the names of Lawrence Allen Ray and Timothy P. Mathers. In that application, the distinguishing feature of each image is that the phase of the image intensifier modulation is unique relative to the modulation of the illuminator. If a series of n images is to be collected, then the preferred arrangement is for successive images to have a phase shift of 2π/n radians from the phase of the previous image. This specific shift is not required, however, so long as the phase shifts are unique. The resultant set of images is referred to as an image bundle. The range at a pixel location is estimated by selecting the intensity of the pixel at that location in each image of the bundle and performing a best fit of a sine wave of one period through the points. The phase of the resulting best-fitted sine wave is then used to estimate the range to the object, based upon the wavelength corresponding to the modulation frequency.
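The sine-wave fit described above can be sketched for a single pixel as follows. This is a minimal illustration under the stated assumption of equally spaced phase offsets of 2π/n radians, for which the least-squares fit reduces to a single-bin discrete Fourier transform; it is not the implementation of the referenced application:

```python
# Hypothetical sketch: recovering the phase at one pixel from an
# image bundle of n intensity samples, the i-th sample taken with the
# intensifier modulation offset by 2*pi*i/n radians.
import math

def estimate_phase(samples):
    """Least-squares fit of one period of a sine through the samples.

    For equally spaced phase offsets the best-fit phase is obtained
    from the correlation of the samples with one cosine/sine cycle.
    """
    n = len(samples)
    c = sum(v * math.cos(2 * math.pi * i / n) for i, v in enumerate(samples))
    s = sum(v * math.sin(2 * math.pi * i / n) for i, v in enumerate(samples))
    # For I_i = A + B*cos(2*pi*i/n + phi): c ~ B*cos(phi), s ~ -B*sin(phi)
    return math.atan2(-s, c) % (2 * math.pi)

# Synthetic 4-image bundle: offset 100, amplitude 50, true phase 1.0 rad
bundle = [100 + 50 * math.cos(2 * math.pi * i / 4 + 1.0) for i in range(4)]
phi = estimate_phase(bundle)  # recovers ~1.0 rad
```

The recovered phase would then be converted to range via the modulation wavelength, as described above.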
An image intensifier operates by converting photonic energy into a stream of electrons, amplifying the energy of the electrons and then converting the electrons back into photonic energy via a phosphor plate. One consequence of this process is that color information is lost. Since color is a useful property of images for many applications, a means of acquiring color information that is registered with the range information is extremely desirable. One approach to acquiring color is to place a dichromatic mirror in the optical path before the microchannel plate. Following the mirror, one image capture plane (i.e., one image sensor) is provided for the range portion of the camera and another image capture plane (another sensor) is provided for the color texture capture portion of the camera. This is the approach taken by 3DV Technology with their Z-Cam product. Besides the added expense of two image capture devices, there are additional drawbacks in the need to register the two image planes precisely, together with alignment of the optical paths. A further difficulty is collating image pairs gathered by different sources.
Another approach is described in detail in commonly assigned copending application Ser. No. 09/572,522, entitled "Method and Apparatus for a Color Scannerless Range Image System" and filed May 17, 2000 in the names of Lawrence Allen Ray and Louis R. Gabello. In this system, a primary optical path is established for directing image light toward a single image responsive element. A beamsplitter located in the primary optical path separates the image light into two channels, a first channel including an infrared component and a second channel including a color texture component. One of the channels continues to traverse the primary optical path and the other channel traverses a secondary optical path distinct from the primary path. A modulating element is operative in the first channel to receive the infrared component and a modulating signal, and to generate a processed infrared component with phase data indicative of range information. An optical network is provided in the secondary optical path for recombining the secondary optical path into the primary optical path such that the processed infrared component and the color texture component are directed toward the single image responsive element. While this approach avoids the added expense of two image capture devices, there continues to be the need to register the two image planes precisely, together with alignment of the optical paths.
Another approach is to capture an image bundle by using two interchangeable optical assemblies: one optical assembly for the phase image portion and a separate optical assembly for the color texture image portion. This approach is described in detail in commonly assigned copending application Ser. No. 09/451,823, entitled "Method and Apparatus for a Color Scannerless Range Image System" and filed Nov. 30, 1999 in the names of Lawrence Allen Ray, Louis R. Gabello and Kenneth J. Repich. The drawback of this approach is the need to switch lenses and the possible misregistration that might occur due to the physical exchange of lens elements. There is an additional drawback in the time required to swap the two optical assemblies, and the effect that delay may have on the spatial coincidence of the images.
A scannerless range imaging camera may operate either as a digital camera or as a camera utilizing film. In the case of a film-based system there are some other requirements, particularly registration requirements, that need to be met. These requirements, and means for satisfying them, are described in the aforementioned copending application Ser. No. 09/342,370. As mentioned above, the drawback of such a camera system, including a film-based system, is its inability to capture a color image.
In most conventional digital imaging systems, the ability to determine color is accomplished by means of a color filter array (CFA) arranged in front of the image responsive device, such as a charge-coupled device (CCD). The CFA overcomes the CCD's inherent inability to discriminate color, i.e., the CCD is basically a monochrome device. The manufacture and use of a CFA are well known; see, e.g., U.S. Pat. No. 4,315,978, entitled "Solid-state color imaging device having a color filter array using a photocrosslinkable barrier". This filter system differs from a standard filter in that it is an array of small color filters, and it is employed by a large number of digital cameras, for example the DCS-series of cameras manufactured by Eastman Kodak Company. Typically, the array is comprised of 2×2 pixel sub-filters. Each sub-filter block has two diagonal cells that filter green light, while one of the remaining cells filters red light and the other filters blue light. Upon completing the image capture, a full color pixel is formed by interpolation based upon neighboring pixels capturing the desired colors. Color filter arrays with more than three primary colors are also known in the art (see, e.g., U.S. Pat. No. 5,719,074, entitled "Method of making a planar color filter array for CCDs from dyed and mordant layers"). In particular, this method allows for filters with any repeatable pattern of color sensitive dyes. Each of these patents is incorporated herein by reference.
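The interpolation step described above can be sketched as follows. This is a simplified illustration of nearest-neighbor averaging over a 2×2 CFA tiling with green on the diagonal; the function names and the tiling order are assumptions, and practical cameras use considerably more sophisticated demosaicing:

```python
# Illustrative sketch of forming a full-color pixel from a CFA mosaic
# by averaging the nearest cells of each color in a 3x3 window.
def pattern(j, i):
    # Assumed GRBG tiling: green on the diagonal of each 2x2 block,
    # red and blue on the off-diagonal cells.
    return ("G", "R", "B", "G")[(j % 2) * 2 + (i % 2)]

def bayer_color(cfa, y, x):
    """Return an (r, g, b) estimate at interior pixel (y, x) of a mosaic."""
    h, w = len(cfa), len(cfa[0])

    def avg(color):
        # Mean of the cells of the requested color within the window.
        vals = [cfa[j][i]
                for j in range(max(0, y - 1), min(h, y + 2))
                for i in range(max(0, x - 1), min(w, x + 2))
                if pattern(j, i) == color]
        return sum(vals) / len(vals)

    return tuple(avg(c) for c in ("R", "G", "B"))
```

For an interior pixel, the 3×3 window always contains at least one cell of each color under this tiling, so each channel can be estimated from its same-color neighbors.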
It is known, in certain cases, to apply color filters to an image intensifier. In U.S. Pat. No. 4,374,325, entitled "Image intensifier arrangement with an in situ formed output filter", an image intensifier device is provided with color filters on its input and output surfaces so as to intensify a color image without losing the color content. Each filter consists of an array of red, green and blue elements, and these elements are precisely aligned in both input and output filters to avoid degradation of the color content. A method of producing the output filter in situ is described to provide the required accuracy of alignment. In U.S. Pat. No. 5,233,183, entitled "Color image intensifier device and method for producing same", a four color system is specified in which a color image intensifier device includes infra-red filters in an RGB input matrix and a narrow band output filter is assigned to represent IR information in the RGB output matrix. In each of these cases, the output image from the intensifier is adapted for human viewing; thus the output image needs to be reconverted to a color image, and hence the need for a second color filter behind the phosphor element at the output of the intensifier. In U.S. Pat. No. 5,161,008, entitled "Optoelectronic image sensor for color cameras", an image sensor includes an image intensifier arranged between an interline type semiconductor sensor, coupled to the output of the intensifier, and a color stripe filter disposed in front of the photocathode, such that one color stripe of the color stripe filter is associated with one column of light-sensitive elements of the semiconductor sensor. Each of these patents is incorporated herein by reference.
As mentioned above, the drawback of a range imaging camera system, including a film-based system, is the inability of current designs to capture a color image. What is needed is a convenient camera system that would avoid the aforementioned limitations and capture ranging information without sacrificing color information that would otherwise be available for capture.
An object of the invention is to obtain a color image along with range information for each point on the image.
A further object is to obtain such a color image by introducing a color filter array prior to the photocathode on the microchannel plate in the intensifier, where the color filter array is matched to the spatial pattern of the microchannel plate. Individual cells of the filter array are also arranged to provide a simple means of interpolation.
The present invention is directed to overcoming one or more of the problems set forth above. Briefly summarized, according to one aspect of the invention, a color scannerless range imaging system includes an illumination system for illuminating the scene with modulated illumination of a predetermined modulation frequency and an image capture section positioned in an optical path of the reflected illumination from the scene for capturing a plurality of images thereof, including (a) at least one range image corresponding to the reflected modulated illumination and including a phase delay corresponding to the distance of objects in the scene from the range imaging system, and (b) at least one other image of reflected unmodulated illumination corresponding to color in the scene. The image capture section includes a color filter array arranged with a first type of color filter that preferentially transmits the reflected modulated illumination and one or more other color filters that preferentially transmit the reflected unmodulated illumination, an image intensifier receiving the reflected illumination and including a modulating stage for modulating the reflected modulated illumination from the scene with the predetermined modulation frequency, thereby generating the range information, and an image responsive element for capturing an output of the image intensifier, including the range image corresponding to the reflected modulated image light and the other image of reflected unmodulated image light corresponding to color in the scene. The image intensifier is structured with channels that generally correspond to the color filter array such that the intensifier provides the range image from channels corresponding to the first color filter and the other image corresponding to color in the scene from channels corresponding to the one or more other color filters.
The advantage of this invention is that only a single image capture system is required, thereby reducing cost as well as correlation and image capture variations. Moreover, the combined range and texture image is in color rather than monochrome. The system requires neither beam-splitters nor difficult optical waveguides, and the overall system may be implemented as an attachment to a standard camera system.