Various techniques are known for acquiring three-dimensional images of a scene, i.e., images that include depth or distance information for the scene. Exemplary methods of multi-dimensional imaging include: (1) time-of-flight measurement of light, (2) reflected-radiation phase detection, and (3) stereoscopic image triangulation. Techniques (1) and (2) generally require that the scene be illuminated, for example by a pulsed laser beam, and that depth data be acquired point by point across a field of view. Technique (3) requires multiple imaging sensor arrays and substantial computational power in order to derive depth data in real time. It is, however, desirable in many applications to acquire depth or distance information across an entire field of view simultaneously and in substantially real time.
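The principle underlying technique (1) may be illustrated by a short sketch: distance is recovered from the round-trip travel time of a light pulse. The function name and numeric values below are illustrative assumptions, not taken from any of the cited art, and the measurement is idealized as noise-free:

```python
# Sketch: distance from a round-trip time-of-flight measurement.
# Assumes an idealized, noise-free measured delay; names are illustrative.

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to a reflecting point: the pulse travels out and back,
    so the one-way distance is half the round-trip path length."""
    return C * round_trip_seconds / 2.0

# A 100 ns round trip corresponds to a point roughly 15 m away.
print(tof_distance(100e-9))
```

In a point-by-point system of this kind, such a computation would be repeated for every scanned point in the field of view, which is why whole-field acquisition is attractive.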
Range-gated cameras are known in the art. Such cameras may include a gated detector array working in conjunction with a pulsed or stroboscopic light source that illuminates a scene. The array may be gated, for example, by coupling a gated intensifier thereto, or by other means well known in the art. The gate timing of the detector array is delayed relative to the light source so that only light reflected from objects within a desired range of distances from the camera is captured. Such a camera cannot, however, determine the distances to various objects, or to different points on an object, within that range.
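The relation between the gate timing and the imaged range slice can be sketched as follows. The sketch assumes the gate opens at a fixed delay after the light pulse is emitted and stays open for a fixed width; the function name and example values are illustrative assumptions:

```python
# Sketch: the distance window selected by a range gate.
# The detector is enabled only between gate_delay and gate_delay + gate_width
# after the light pulse is emitted, so only reflections whose round-trip
# time falls inside that interval are captured. Names are illustrative.

C = 299_792_458.0  # speed of light in vacuum, m/s

def gate_window(gate_delay_s: float, gate_width_s: float) -> tuple[float, float]:
    """Return the (near, far) bounds, in meters, of the imaged range slice."""
    near = C * gate_delay_s / 2.0
    far = C * (gate_delay_s + gate_width_s) / 2.0
    return near, far

# A 200 ns delay with a 100 ns gate images objects roughly 30-45 m away.
near, far = gate_window(200e-9, 100e-9)
print(near, far)
```

All reflections inside that window are integrated together, which is precisely why a plain range-gated camera yields only a range slice and not per-point distances.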
U.S. Pat. No. 6,057,909, which is incorporated herein by reference, describes an apparatus for generating a multi-dimensional data set representing an image that includes distance information for objects in a scene. The apparatus includes: a modulated source of radiation, having a first modulation function, which directs radiation toward a scene; a detector, which detects radiation reflected from the scene, modulated by a second modulation function, and generates, responsive to the detected modulated radiation, signals responsive to the distance to regions of the scene; a processor, which receives the signals from the detector and forms an image, based on the signals, having an intensity value distribution indicative of the distance of objects from the apparatus; and a controller, which varies at least one of the first and second modulation functions, responsive to the intensity value distribution of the image formed by the processor. The apparatus described in U.S. Pat. No. 6,057,909 may acquire depth or distance information simultaneously over the entire image.
U.S. Pat. No. 6,091,905, which is incorporated herein by reference, describes an apparatus for generating an image indicating distances to objects in a scene. The apparatus comprises a radiation source and modulator, telecentric optics for receiving and collimating the radiation reflected from the scene, a detector and a processor. The detector receives the collimated, reflected radiation and sends a signal to the processor. The processor forms an image having an intensity value distribution indicative of the distance of objects from the apparatus.
U.S. Pat. No. 6,100,517, which is incorporated herein by reference, describes an apparatus for generating an image indicating distances to points on objects in a scene, comprising: a modulated source of radiation, having a first modulation function, which directs radiation toward a scene such that a portion of the modulated radiation is reflected from the points and reaches the apparatus; an array detector which detects radiation from the scene, modulated by a second modulation function, each element of the array detector being associated with a point in the scene and generating a signal responsive to a part of the reflected radiation reaching the apparatus, the magnitude of a particular element's signal being dependent on the distance of the point in the scene associated with that element; and a processor which forms an image, having an intensity value distribution indicative of the distance of each of the points in the scene from the apparatus, based on the magnitude of the signal associated with each point; wherein the first and second modulation functions comprise repetitive pulsed modulation functions which are different from each other.
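The common principle of the patents above is that a gated signal's magnitude encodes distance; a practical difficulty is that raw intensity also depends on surface reflectivity. One conventional way to cancel reflectivity, sketched below, is to take the ratio of two differently gated acquisitions per pixel. This is a generic illustration under a simplifying linear-ramp assumption, not the specific method claimed in the cited patents; all names are hypothetical:

```python
# Sketch: per-pixel distance from the ratio of two gated acquisitions.
# Assumption (illustrative only): a short-gated sample S grows linearly
# with round-trip time inside the gate, while a long-gated reference R
# captures the full returned pulse, so the ratio S/R cancels reflectivity.

C = 299_792_458.0  # speed of light in vacuum, m/s

def depth_from_ratio(short_gate: float, full_gate: float,
                     gate_width_s: float) -> float:
    """Estimate a pixel's distance from two gated intensity samples."""
    if full_gate <= 0.0:
        raise ValueError("no return signal at this pixel")
    # Clamp the ratio so detector noise cannot push it outside [0, 1].
    fraction = min(max(short_gate / full_gate, 0.0), 1.0)
    round_trip = fraction * gate_width_s
    return C * round_trip / 2.0

# A pixel returning half of its full-gate energy within the short gate
# lies halfway through a 100 ns window, i.e. roughly 7.5 m away.
print(depth_from_ratio(0.5, 1.0, 100e-9))
```

Because both samples are acquired by the same detector element, the ratio is computed independently at every pixel, yielding a whole-field depth map without point-by-point scanning.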
Recent technological advances in industries such as gaming (e.g. immersive gaming) and multimedia communication have led to a demand for efficient and inexpensive real-time 3D imaging methods, devices and systems. 3D imaging systems for many consumer applications require the ability to generate a composite image containing both a two-dimensional image displayable on a two-dimensional screen and a depth map indicating the distance of specific points or objects in the two-dimensional image.
Prior art imaging methods, devices and systems for generating multi-dimensional data sets representing a composite or multi-dimensional (e.g. three-dimensional) image of a scene have used a combination of a ranging sensor array and a visible light sensor array arranged along a common line of sight, for example by using a technique referred to as bore-sighting. Common line of sight arrangements are required in order to achieve proper correlation between two-dimensional image data acquired by the visible light sensor array and ranging data acquired by the ranging sensor array. A common line of sight is achieved using optical arrangements including collimators, beam splitters and other optical elements. The use of common line of sight optical arrangements is, however, both cumbersome and costly, and the complexities associated with producing such arrangements are a burden in the production of multi-dimensional imaging devices and systems.
There is a need in the field of imaging for improved methods, devices and systems for generating multi-dimensional image data sets representing a composite image (i.e. a two-dimensional image with a depth map) of a scene.