A variety of metrology tools are used to measure process performance on both monitor and product wafers. A single measurement may be taken at the wafer center, but a measurement profile is typically acquired at a number of wafer sites to establish uniformity as a function of radius or angle (theta). Stage repeatability and accuracy are a primary source of measurement drift. For example, a stage offset can impair the measured uniformity of film thickness in critical layers, such as the gate oxide layer, reducing product yield. A stage offset of less than 2 mm has been shown to influence routine statistical process control (SPC) monitor status and unnecessarily reduce tool availability. Traditional stage calibration methods are relatively insensitive to stage drift.
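The geometric effect of a stage offset can be illustrated with a short sketch. Assuming a hypothetical offset of (dx, dy) millimeters, the radius actually sampled at a nominal polar site (r, theta) follows from simple coordinate translation; the specific site values below are illustrative, not taken from any particular tool.

```python
import math

def sampled_radius(r_nominal, theta_deg, dx, dy):
    """Radial position actually sampled when the stage is offset by (dx, dy) mm.

    r_nominal: intended measurement radius (mm)
    theta_deg: intended measurement angle (degrees)
    dx, dy:    hypothetical stage offset components (mm)
    """
    t = math.radians(theta_deg)
    x = r_nominal * math.cos(t) + dx
    y = r_nominal * math.sin(t) + dy
    return math.hypot(x, y)

# A 2 mm x-offset moves a site nominally at r = 95 mm, theta = 0 degrees
# out to an actual radius of 97 mm, where near-edge thickness gradients
# are typically steepest; the same offset at theta = 90 degrees barely
# changes the sampled radius.
print(sampled_radius(95.0, 0.0, 2.0, 0.0))   # 97.0
print(sampled_radius(95.0, 90.0, 2.0, 0.0))  # ~95.02
```

Because the radial error depends on theta, a small offset distorts the apparent radial uniformity profile rather than shifting it uniformly, which is why it can trip SPC limits without any real process change.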
Fourier transform infrared (FTIR) spectroscopy is a representative wafer metrology technique in that it supports numerous measurement applications, such as oxide dopant concentration (boron, phosphorus, fluorine), film thickness (epitaxial silicon, oxide, nitride, photoresist), and bare-silicon impurity concentrations (interstitial oxygen, substitutional carbon). FTIR is sensitive to molecular absorption of light in the 2–40 μm wavelength range. FTIR measurements can be performed in either transmission or reflection mode. Transmission methods, for example, are typically used when calibrating the position of a stage holding a bare silicon wafer. The wafer stage permits unobstructed illumination above and below the wafer by supporting it on two vacuum “finger” supports located at a radius of 95 mm and theta of 90 and 270 degrees.
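FTIR spectra are conventionally plotted in wavenumbers (cm⁻¹) rather than wavelength, so it may help to note the equivalent range. A minimal conversion sketch:

```python
def wavenumber_cm(wavelength_um):
    """Convert a wavelength in micrometers to a wavenumber in cm^-1.

    1 cm = 1e4 um, so wavenumber = 1e4 / wavelength_um.
    """
    return 1.0e4 / wavelength_um

# The 2-40 um range quoted above corresponds to 5000-250 cm^-1.
print(wavenumber_cm(2.0))   # 5000.0
print(wavenumber_cm(40.0))  # 250.0
```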
The FTIR stage is calibrated interactively by visually aligning a HeNe laser spot to center and edge scribe lines on a bare wafer. Another demonstrated calibration technique relies on the relative infrared signal strength measured through pre-drilled aperture holes in a wafer. Both methods are time-intensive, insensitive to small offsets, and of limited accuracy.
An improved wafer stage calibration method is desired.