This application is based on Japanese Patent Application No. 2000-97241 filed on Mar. 31, 2000, the contents of which are hereby incorporated by reference.
1. Field of the Invention
The present invention relates to an apparatus for measuring a three-dimensional shape of an object by irradiating the object with light such as a slit light, the apparatus obtaining both measured data to be used for three-dimensional measurement of the object and two-dimensional image data of the object.
2. Description of the Prior Art
A non-contact type three-dimensional measuring apparatus is often utilized for inputting data to a CG system or a CAD system, for anthropometry, for visual recognition by robots and so on, since it can conduct measurements more rapidly than a contact type measuring apparatus.
There is known, as a method of non-contact measurement, a method wherein an object is measured by projecting a specific detection light onto the object and receiving the reflected light, based on the triangulation method. For example, a laser beam is projected onto an object from a light source such as a semiconductor laser, and the bright spot occurring thereby is captured by a camera from an angle different from that of the light source. A three-dimensional location is thus determined from the triangle formed by the positions of the light source, the camera and the bright spot.
The spotlight projection method and the slit light projection method are known as examples of such methods. In the spotlight projection method, a spotlight is projected onto an object and the projected spotlight is optically scanned in two dimensions. In the slit light projection method, a slit light is projected onto an object and the projected slit light is optically scanned in one dimension. The slit light projection method is also called a light-section method. In the spotlight projection method, there is obtained a spot image whose section on the imaging surface is punctiform; in the slit light projection method, there is obtained a slit image whose section is linear. A three-dimensional image is obtained as a collection of pixels indicating three-dimensional positions of a plurality of parts on the object.
Turning to the drawings, FIGS. 26A to 26D generally illustrate a scheme of the slit light projection method, and FIGS. 27A and 27B generally illustrate the principle of measurement which employs the slit light projection.
A measurement slit light U in the form of a strip having a thin section is irradiated onto an object Q to be measured, and the reflected light is made incident on, for example, an imaging surface S of a two-dimensional light receiving element (FIG. 26A). If the irradiated part of the object Q is flat, the obtained image (slit image) is linear (FIG. 26B). If the irradiated part is rough, the obtained image is bent or step-wise (FIG. 26C). The distance between the measuring apparatus and the object Q thus influences the position at which the reflected light is incident on the imaging surface S (FIG. 26D). Sampling of three-dimensional positions on the object is realized by deflecting the measurement slit light U in its widthwise direction so as to scan the part of the surface of the object which is visible from the light-receiving side. The number of sampling points depends on the number of pixels of the image sensor.
In FIGS. 27A and 27B, a light projection system and a light receiving system are so arranged that a base line AS connecting a start point A of light projection with an imaging surface S of the light receiving system is perpendicular to the light receiving axis. The light receiving axis is perpendicular to the imaging surface S, and the intersection S0 of the light receiving axis with the imaging surface S is set as the origin of a three-dimensional rectangular coordinate system. The Z axis is the light receiving axis, the Y axis is the base line AS, and the X axis is the longitudinal direction of the slit light.
HH′ denotes the distance between a front principal point H and a rear principal point H′ of a light receiving lens, and b denotes the distance between the point S0 and the rear principal point H′.
The distance b is a so-called image distance. The image distance b is the distance from the rear principal point H′ of the lens to the imaging surface S when an image of an object at a finite position is formed on the imaging surface S. The image distance b depends on the relationship between the focal length of the light receiving lens and the feed amount of the lens for focusing.
When the projection angle θa is defined as the angle at which the light projection axis intersects a light projection reference surface (a light projection plane parallel to the light receiving axis) when a point P(X, Y, Z) on the object is irradiated with the measurement slit light U, and the light receiving angle θp is defined as the angle at which the line connecting the point P with the front principal point H intersects the plane containing the light receiving axis (light receiving axis plane), the coordinate Z of the point P is represented by the following equations:

L = L1 + L2
  = Z tan θa + (Z − HH′ − b) tan θp

∴ Z = {L + (HH′ + b) tan θp} / {tan θa + tan θp}
When the light receiving position of the point P on the imaging surface is set as P′(xp, yp, 0) (see FIG. 27A) and the imaging magnification of the light receiving lens is set as β, the coordinates of the point P are:

X = xp/β
Y = yp/β
In the above equations, the base line length L is determined by the positions of the light projection system and the light receiving system and therefore has a predetermined value. The light receiving angle θp can be calculated from the relationship tan θp = b/yp. The imaging magnification β of the light receiving lens is calculated from β = −b/(Z − HH′ − b).
Thus, after obtaining the distance HH′ between the principal points, the image distance b and the projection angle θa, it is possible to determine the three-dimensional position of the point P by measuring the position P′(xp, yp) on the imaging surface S. The distance HH′ between the front principal point and the rear principal point and the image distance b are determined by the relative positional relationship of the lenses included in the light receiving system.
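The triangulation described above can be expressed as a short numerical sketch. The following is purely illustrative and not part of the disclosed apparatus; the function name, argument order and units are assumptions, while the formulas themselves follow the equations given above:

```python
import math

def point_3d(L, theta_a, HH, b, xp, yp):
    """Recover the coordinates (X, Y, Z) of a point P by triangulation.

    L       : base line length between the projection start point A and S0
    theta_a : projection angle (radians)
    HH      : distance HH' between the front and rear principal points
    b       : image distance
    xp, yp  : position P' of the slit image on the imaging surface S
    """
    tan_p = b / yp                      # tan(theta_p) = b / yp
    Z = (L + (HH + b) * tan_p) / (math.tan(theta_a) + tan_p)
    beta = -b / (Z - HH - b)            # imaging magnification
    X = xp / beta
    Y = yp / beta
    return X, Y, Z
```

The depth Z is computed first, since the imaging magnification β depends on Z; X and Y then follow from the measured image position P′.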
In a three-dimensional measuring apparatus comprising a light receiving lens system having a fixed focal length and a fixed focus, i.e., a three-dimensional measuring apparatus which can achieve measurement only when the light receiving lens and the object are at a single predetermined distance, the distances between the lenses and the imaging surface S are fixed. Fixed values of the distance HH′ and the image distance b can therefore be input into such a three-dimensional measuring apparatus in advance of the measurement.
On the other hand, in a three-dimensional measuring apparatus capable of changing the distance relationship between the light receiving system and the object, it is necessary to move some or all of the lenses included in the light receiving system along the light receiving axis for focusing. Further, a three-dimensional measuring apparatus capable of varying the angle of view of the light receiving system typically has a zooming mechanism. Such a zooming mechanism changes the focal length of the lens system by moving some or all of the lenses along the light receiving axis.
In the above mentioned cases, the relative positions of the lenses can be detected by a potentiometer (position sensor) or, if an automatic lens drive is employed, by an encoder associated with the drive motor, and the distance HH′ between the front principal point and the rear principal point and the image distance b can be obtained from a table memorized in advance of the measurement.
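As a purely illustrative sketch of such a look-up (the table values, encoder counts and function name below are assumptions, not values from the disclosure), the position sensor reading can index a calibration table prepared in advance:

```python
import bisect

# Hypothetical calibration table, memorized in advance of measurement:
# encoder count of the lens drive -> (HH', b), both in millimetres.
CALIBRATION = [
    (0,   (5.2, 48.0)),
    (100, (5.4, 50.5)),
    (200, (5.7, 53.2)),
]

def lens_parameters(encoder_count):
    """Return the (HH', b) pair stored for the nearest calibrated
    lens position at or below the given encoder count."""
    counts = [c for c, _ in CALIBRATION]
    i = max(bisect.bisect_right(counts, encoder_count) - 1, 0)
    return CALIBRATION[i][1]
```

A real apparatus would use a finer table, or interpolate between entries, to cover the full travel of the focusing and zooming lenses.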
The light projection angle θa changes depending on the deflection angle of the measurement slit light U. In the case of a three-dimensional measuring apparatus comprising a galvanomirror for deflection, the projection angle θa is calculated by a known method of recognizing the deflection angle of the measurement slit light U during the capturing process, namely by synchronously controlling the imaging timing of the light receiving element, the angle at which the galvanomirror starts rotating, and the rotational angular speed of the galvanomirror.
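Assuming a constant rotational angular speed of the galvanomirror, the projection angle at each imaging timing can be sketched as follows (a simplified illustration; the function name and parameters are assumptions):

```python
def projection_angle(theta_start, omega, frame_index, frame_period):
    """Deflection angle theta_a of the measurement slit light when frame
    `frame_index` is captured, for a galvanomirror that starts rotating
    at angle `theta_start` (rad) with constant angular speed `omega`
    (rad/s), synchronised with the imaging period `frame_period` (s)."""
    return theta_start + omega * frame_index * frame_period
```

Because the imaging timing, start angle and angular speed are all controlled synchronously, the angle for each captured frame is known without any additional angle sensor.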
In the case of three-dimensional measurement based on the above mentioned principles, a user, as a measurer, decides the position and orientation of the three-dimensional measuring apparatus and changes the angle of view of the light receiving lens as required in order to set a capturing area (scanning area) on the object Q.
The angle of view can be changed by using a zoom lens as the light receiving lens, or by exchanging the light receiving lens with a lens having a different focal length.
To facilitate the above-described framing operation, a monitor image of the object Q obtained by photographing the object Q at the same angle of view as that of the scanning area is useful.
For example, in the three-dimensional CG (three-dimensional computer graphics), two-dimensional image data indicating color information of the object Q are required along with three-dimensional data indicating shape of the object Q in many cases.
U.S. Pat. No. 6,141,105 has proposed a method wherein a color separation into a plurality of optical paths using a beam splitter is performed by a light receiving optical system to obtain both of the three-dimensional data and two-dimensional image data.
FIG. 28 is a schematic view showing a light receiving optical system using a beam splitter 70 for color separation. FIG. 29 is a graph showing a light receiving wavelength of a light receiving element 71. FIG. 30 is a graph showing a light receiving wavelength of a light receiving element 72 for monitoring.
The beam splitter 70 comprises a color separation film (dichroic mirror) 701, a pair of prisms 702 and 703 sandwiching the color separation film 701, a visible ray cutting filter 705 placed on the front side of the light receiving element 71, an infrared ray cutting filter 704 placed at the emission surface of the prism 702, and low pass filters 707 and 708.
Light (light beams) UC incident from a light receiving lens enters the color separation film 701 through the low pass filter 707 and the prism 702.
Light U0 in the oscillation band of the semiconductor laser is reflected at the color separation film 701, is then totally reflected at the incident surface of the prism 702, and is projected from the emission surface toward the light receiving element 71. Among the light U0 projected from the prism 702, light beams that pass through the infrared ray cutting filter 704 and the visible ray cutting filter 705 are received by the light receiving element 71. In turn, light C0 that passes through the color separation film 701 is projected toward the light receiving element 72 through the prism 703 from the emission side thereof. Among the light C0 projected from the prism 703, light beams that pass through the infrared ray cutting filter 706 and the low pass filter 708 are received by the light receiving element 72.
Referring to FIG. 29, the color separation film 701 reflects light of a relatively wide bandwidth containing the wavelength of the slit light (λ = 690 nm), as indicated by the broken line. Thus, the wavelength selectivity of the color separation film 701 is not satisfactory from the viewpoint of making only the slit light incident on the light receiving element 71. In the beam splitter 70, however, the light that ultimately enters the light receiving element 71 has the narrow bandwidth indicated by hatching in FIG. 29, owing to the infrared ray cutting filter 704 having the characteristics indicated by the chain line and the visible ray cutting filter 705 having the characteristics indicated by the continuous line.
In turn, only visible light enters the light receiving element 72, since the infrared rays passing through the color separation film 701, whose characteristics are indicated by the broken line in FIG. 30, are cut off by the infrared ray cutting filter 706 having the characteristics indicated by the continuous line.
In the conventional method of performing color separation using a plurality of light receiving elements and a beam splitter, the relative positional relationship between the light receiving elements is so important that adjusting their relative positions requires great care on the user's part, and the adjustment operation is therefore troublesome. Further, it is necessary to prevent the relative positions from changing due to vibration or shock even after the adjustment, so that production and adjustment entail trouble.
In addition, a high-precision beam splitter is expensive, and the production cost of a three-dimensional measuring apparatus having such a beam splitter is further increased by the need for a plurality of light receiving elements and control drivers. A color CCD is typically used as the light receiving element when a color image is to be obtained; however, as shown in FIG. 30, the wavelength of the light entering the light receiving element 72 is limited to 650 nm or less within the visible region, thereby deteriorating the color reproduction of red.
To overcome the above problems, there has been proposed a method wherein an RGB rotational filter having the transmittances shown in FIGS. 14B, 14C and 14D is used in a light receiving optical system for obtaining both three-dimensional data and two-dimensional image data. According to this method, it is possible to reduce the production cost and to obtain images free from positional differences between pixels.
However, in the above method, the S/N ratio is deteriorated since the RGB rotational filter is used both for obtaining two-dimensional color image data and for obtaining three-dimensional data.
For example, in the case of using a red laser as the light source of the light projection system, the projected light is reflected at the object and passes through the R filter to reach the light receiving element as mentioned above. The three-dimensional data are calculated from output signals of the light receiving element, and components of the output signals other than the reflected projected light become noise which must be eliminated. However, in this method, the R filter used for obtaining the two-dimensional image data has the transmittance shown in FIG. 14B, and, in the case where the wavelength of the semiconductor laser beam is 690 nm, the R filter transmits unnecessary light beams, thereby creating an error factor in calculating the three-dimensional data.
An object of the present invention is to provide a three-dimensional shape measuring apparatus which can obtain both three-dimensional data and two-dimensional image data in a three-dimensional measurement, wherein the three-dimensional data and the two-dimensional image data have no positional difference and scarcely contain noise caused by light of unnecessary wavelengths.
According to one aspect of the present invention, a three-dimensional shape measuring apparatus comprises a light projector for projecting light onto an object; a light receiving element; a light receiving optical system for leading light projected onto and then reflected by the object to the light receiving element, the light receiving optical system including a first optical filter which transmits only light having substantially the same range of wavelengths as that of the light projected from the light projector, at least one second optical filter which transmits light having a range of wavelengths different from that of the first optical filter, and a filter selector which selectively locates the first optical filter or one of the second optical filters in the optical path of the light receiving optical system; and a calculator for obtaining measured data for three-dimensional shape measurement based on signals output from the light receiving element.
Preferably, the second optical filter may include three optical filters for transmitting wavelengths of colors of red, green and blue in order to obtain a two-dimensional image of the object.
Alternatively, the second optical filter may include three optical filters for transmitting wavelengths corresponding to the tristimulus values X, Y and Z.
These and other objects and characteristics of the present invention will hereinafter be described in more detail with reference to the drawings and preferred embodiments.