In the semiconductor industry, there is a continuing trend toward higher device densities. To achieve these densities, efforts have been and continue to be directed toward scaling down device dimensions to submicron levels on semiconductor wafers. Achieving such high device packing density requires smaller and smaller feature sizes, including the width and spacing of interconnecting lines and the surface geometry, such as the corners and edges, of various features. The dimensions of and between such small features are referred to as critical dimensions (CDs). Reducing CDs, and reproducing them more accurately, facilitates achieving higher device densities.
High-resolution lithographic processes are used to achieve small features. In general, lithography refers to processes for transferring patterns between various media. In lithography for integrated circuit fabrication, a silicon slice, the wafer, is coated uniformly with a radiation-sensitive film, the photoresist. The film is selectively exposed to radiation (e.g., optical light, x-rays, or an electron beam) through an intervening master template (e.g., a mask or reticle) that defines a particular pattern. Depending on the coating type, exposed areas of the coating become either more or less soluble than unexposed areas in a particular solvent developer. The more soluble areas are removed with the developer in a developing step, while the less soluble areas remain on the silicon wafer to form a patterned coating. The pattern corresponds to either the image of the mask or its negative. The patterned resist is then used in further processing of the silicon wafer.
Efforts to reduce CDs have included implementing various techniques in connection with the lithographic process, such as reducing the exposure radiation wavelength (e.g., from the 436 nm mercury G-line, to the 365 nm I-line, to 248 nm deep ultraviolet (DUV), to the 193 nm excimer laser), improving optical design, and utilizing metrology techniques (e.g., scatterometry and scanning electron microscopy (SEM)).
The minimum feature that can be printed with an optical lithography system is determined by the following Rayleigh equation:
      CD = k1λ/NA,
where k1 is the resolution factor, λ is the wavelength of the exposing radiation, and NA is the numerical aperture. The NA characterizes the lens's ability to gather diffracted light and resolve fine details onto the wafer, and is determined by the acceptance angle of the lens and the index of refraction of the medium surrounding the lens (e.g., air).
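As a rough numerical illustration (not part of the original text), the Rayleigh equation can be evaluated directly. The parameter values below (k1 = 0.4, a 193 nm excimer laser source, NA = 0.93) are assumptions chosen for illustration only, not values taken from this text.

```python
def rayleigh_cd(k1: float, wavelength_nm: float, na: float) -> float:
    """Minimum printable feature size (critical dimension), CD = k1 * lambda / NA,
    in nanometers."""
    return k1 * wavelength_nm / na

# Illustrative dry-lithography parameters (assumed, not from the text):
cd = rayleigh_cd(k1=0.4, wavelength_nm=193.0, na=0.93)
print(f"Minimum CD: {cd:.1f} nm")  # prints "Minimum CD: 83.0 nm"
```

Lowering k1 (process improvements), shortening λ, or raising NA each shrinks the printable CD, which is why the wavelength reductions listed above were pursued.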
Immersion technology offers improved resolution enhancement and higher NAs than conventional projection lithography. Immersion lithography facilitates further reduction of CDs by employing an immersion fluid in the gap between the final optical component (e.g., the lens) and the wafer being exposed. The liquid in this gap has a refractive index, n, that is greater than the refractive index of air (which is only slightly greater than 1), where the refractive index is defined as the ratio of the speed of light in a vacuum to the speed of light in a particular medium. Imaging through the liquid adds the factor n to the denominator of the Rayleigh equation, resulting in
      CD = k1λ/(n·NA).
Utilizing an immersion medium with a refractive index greater than that of air not only increases the achievable NA, but also decreases the effective wavelength of the exposure radiation propagating within the immersion medium, without changing the exposure radiation source, lasers, lens materials, etc. Moreover, when a base developer is used as the immersion lithography fluid, the resist is developed as the immersion fluid is removed from the wafer after irradiation. Consequently, the requirement for a separate development step is eliminated, thereby simplifying the lithographic process.
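To make the effect concrete, the following sketch compares the dry and immersion cases using the modified Rayleigh equation. The values k1 = 0.4 and NA = 0.93 are illustrative assumptions; n = 1.47 is the refractive index for water stated elsewhere in this text.

```python
def immersion_cd(k1: float, wavelength_nm: float, na: float, n: float = 1.0) -> float:
    """Critical dimension with an immersion medium of refractive index n:
    CD = k1 * lambda / (n * NA). With n = 1 this reduces to the dry case."""
    return k1 * wavelength_nm / (n * na)

k1, wl, na = 0.4, 193.0, 0.93               # illustrative values (assumed)
dry = immersion_cd(k1, wl, na, n=1.0)        # air: n approximately 1
wet = immersion_cd(k1, wl, na, n=1.47)       # water: n about 1.47 (per the text)

print(f"dry CD:       {dry:.1f} nm")
print(f"immersion CD: {wet:.1f} nm")
# Effective wavelength inside the immersion medium is lambda / n:
print(f"effective wavelength in water: {wl / 1.47:.1f} nm")
```

The same 193 nm source thus behaves, in terms of resolution, like a shorter-wavelength source, without changing the laser or the lens materials.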
Water is the most common liquid employed in current immersion lithography systems. This is because water has an index of refraction of about 1.47, exhibits absorption of less than about 5% at working distances of up to six millimeters, is compatible with most photoresists and lenses, and, in ultra-pure form, is non-contaminating. Specifically, the water employed for most immersion applications is double deionized, distilled, and degassed.
However, while immersion lithography is promising, a number of concerns associated with its implementation must be resolved before the technology gains wide acceptance. Photoresist material dissolved in the immersion medium can change the optical properties of the medium (e.g., its refractive index and lithographic constant), thereby impairing the efficiency of immersion lithography systems and elevating the costs associated with expensive immersion media. Moreover, the immersion liquid has a tendency to develop micro-bubbles, which negate the benefits offered by the technology, and maintaining a consistently bubble-free liquid between the lens and the wafer is very difficult. Polarization effects at the lens are also a significant concern. Additionally, immersion lithography typically requires large, expensive lenses.
Thus, there exists a need for an improved architecture for at least monitoring and/or controlling immersion medium characteristics in real time.