In industries that rely on high-precision microstructure processing, such as the IC, semiconductor, LCD, automation, and electro-optical measurement industries, three-dimensional surface profile measurement is a key procedure for ensuring the consistency of the manufacturing process. Among the currently available surface profile measurement techniques, the most commonly used are optical or electro-optical methods, since they can perform surface profile measurement in a non-contact manner and thereby accurately inspect the profile, thickness, or size of any microstructure formed on the surface of a measured object. Many optical measurement techniques are currently available, including confocal microscopy, phase-shifting interferometry, and white-light vertical scanning interferometry; they are designed for different measurement environments and different applications.
Conventionally, confocal microscopy is an optical imaging technique used to reconstruct three-dimensional images of a measured object by using a spatial pinhole to eliminate out-of-focus light or flare originating from portions of the measured object that lie outside the focal plane. As only one point is illuminated at a time in confocal microscopy, 2D or 3D imaging of the measured object requires scanning over a regular raster in the specimen, usually a fast horizontal scan in conjunction with a slower vertical scan, so as to generate optical sections of the measured object at different depths. Thereafter, by using a computer to perform a reconstruction process upon the obtained optical sections of different depths, an image containing information relating to the three-dimensional profile of the measured object can be obtained.
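The reconstruction step described above can be sketched in code: for each lateral point, the confocal response is strongest in the optical section whose depth coincides with the surface, so the surface height may be estimated as the scan depth of the maximum response. The following is a minimal illustrative sketch; the function name and data layout are assumptions for illustration, not taken from any cited patent.

```python
# Illustrative sketch of confocal height-map reconstruction from a stack
# of optical sections. stack[k][y][x] holds the confocal intensity of
# section k at lateral point (x, y); z_positions[k] is the scan depth of
# section k. The estimated surface height at (x, y) is the depth whose
# section gives the strongest (best-focused) response.

def height_map(stack, z_positions):
    ny, nx = len(stack[0]), len(stack[0][0])
    heights = [[0.0] * nx for _ in range(ny)]
    for y in range(ny):
        for x in range(nx):
            # Confocal response of this lateral point across all depths.
            responses = [stack[k][y][x] for k in range(len(stack))]
            # Depth index of the peak response approximates the surface.
            k_best = max(range(len(responses)), key=responses.__getitem__)
            heights[y][x] = z_positions[k_best]
    return heights
```

In practice the peak would be interpolated between sections for sub-step resolution, but the peak-picking step above captures why a full vertical scan is needed at every lateral point.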
There are already many studies relating to the use of confocal microscopy, for instance, the confocal wafer inspection system disclosed in U.S. Pat. No. 6,934,019. As shown in FIG. 1, a confocal imaging optical setup images a point light source “S” 11 through a lens 12 into a sharply focused second point “S1” 13, and then relays the image from the second point 13 onto a splitter 14, which reflects it onto a tiny spatial filter “S2” 15. Field extension can be obtained by stretching the chromatic aberration of the focusing lens 12, resulting in a setup with an infinity of purely confocal systems, one for each wavelength, each with a different sharply focused point 13a, 13b, 13c, and so on. For example, the first color has a first sharply focused point 13a, the second color has a second sharply focused point 13b, and the third color has a third sharply focused point 13c. Moreover, only the one color whose focal length matches the height of the surface arrives at the filter 15. If the surface height matches the first sharply focused point 13a, the first color is detected; if the surface height matches the second sharply focused point 13b, the second color is detected; and so on. By placing an inspected wafer on a movable platform, or by enabling the whole confocal imaging optical setup to move relative to the inspected wafer, height information relating to the three-dimensional profile of the inspected wafer can be obtained. However, since such a point-light-field optical setup is completely blind to all of space except the sharply focused second point 13, each inspection can inspect only a single point on the wafer surface, so that inspecting the whole surface of the wafer can be very time consuming and might cause the production yield of wafer manufacturing to drop significantly.
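The wavelength-to-height encoding described above can be sketched as follows: each wavelength focuses at a different depth, so the peak wavelength passing the spatial filter encodes the surface height through a calibration of wavelength versus focal depth. The function name, calibration format, and numeric values below are illustrative assumptions, not taken from the cited patent.

```python
# Illustrative sketch of the chromatic confocal principle: a calibration
# table maps each wavelength to the depth at which it focuses sharply.
# The peak wavelength reported by the spectrometer is converted to a
# surface height by linear interpolation between calibration points.

def wavelength_to_height(peak_nm, calibration):
    # calibration: list of (wavelength_nm, height_um) pairs.
    pts = sorted(calibration)
    if peak_nm <= pts[0][0]:
        return pts[0][1]   # clamp below the calibrated range
    if peak_nm >= pts[-1][0]:
        return pts[-1][1]  # clamp above the calibrated range
    for (w0, h0), (w1, h1) in zip(pts, pts[1:]):
        if w0 <= peak_nm <= w1:
            t = (peak_nm - w0) / (w1 - w0)
            return h0 + t * (h1 - h0)
```

For example, with an assumed calibration of 450 nm at 0 µm, 550 nm at 50 µm, and 650 nm at 100 µm, a detected peak at 500 nm would correspond to a height of 25 µm. Note that one such conversion yields the height of only a single point, which is why scanning the whole wafer surface is time consuming.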
It is noted that since only one color can arrive at the filter 15 for imaging at each inspection, the obtained image is monochrome and can simply be detected and analyzed by the use of a spectrometer.
There is another confocal microscope disclosed in U.S. Pat. No. 5,785,651. In the operation of the confocal microscope disclosed in this US patent, the polychromatic light from a light source is projected onto an achromatic collimator lens, where it is collimated without undue chromatic aberration, and is then projected onto a Fresnel optical element so as to form a spectrally spread light field, in which the focal point of the projected light varies according to wavelength, to be used for inspecting the surface profile of a measured object. Similarly, since the polychromatic light is modulated into such a spectrally spread light field, each inspection using the aforesaid confocal microscope can inspect only a single point on the measured object, so that inspecting the whole surface of the measured object can be very time consuming and might cause production efficiency to drop significantly; moreover, since only one color can arrive at the imaging device for imaging at each inspection, the obtained image is monochrome and can simply be detected and analyzed by the use of a spectrometer.
Yet another confocal distance sensor is disclosed in U.S. Pat. Pub. No. 2004/0109170. The aforesaid confocal distance sensor is designed for rapid optical distance measurement based on the confocal imaging principle, by which different spectral components of the illuminating light are focused at different distances from the optical imaging system due to a chromatic aberration of the optical imaging system. Nevertheless, although the aforesaid confocal distance sensor can be adapted for inspecting the surface profile of a measured object, it is likewise capable of inspecting only a single point on the measured object during each inspection.
Please refer to FIG. 1B, which is a schematic diagram showing a conventional optical device using a diffractive optical element (DOE) for generating a linear dispersion field. In FIG. 1B, the optical device 16 projects a broadband light emitted from a broadband light source 160 to a diffractive optical element (DOE) 165 through a semi-cylindrical lens 161, a slit 162, a collimation lens 163 and a beam splitter 164 so as to form a linear dispersion field. Since the numerical aperture (NA) of the linear dispersion field generated by the DOE 165 is comparatively low, the linear light field must be collimated by another collimation lens 166 before it is guided to an objective lens 167, which projects it onto a measured object 1000; from there it is reflected back to the objective lens 167 and then guided to another slit 169 through the beam splitter 164 and a conjugate lens 168. After passing through the slit 169, the resulting light field is modulated by a lens 170 and a light grid 171 and is then detected by an image sensor 172, to be used for generating an image of the measured object accordingly. Although the required linear dispersion field can be generated by the DOE, the aforesaid optical device contains a considerable number of components, so that not only is it very complex and bulky in terms of system design, but it may also be very expensive to build.