Almost all of today's computer chips are built on silicon wafers. Semiconductor manufacturers produce many kinds of integrated circuits (ICs), or chips. The precise process followed to make a chip varies according to the type of chip and the manufacturing company. However, most wafer processing involves the same basic steps.
FIG. 1 illustrates the exposure phase of the photolithography process that occurs during fabrication of an integrated circuit (or “chip”). Photolithography is used to create layers of circuit patterns on the chip. First, the wafer is coated with a light-sensitive material called photoresist. Light is shone through a patterned plate called a reticle (or “mask”) to expose the resist, in much the same way that film is exposed to light to form a photographic image. Following the lithographic process, the wafer is etched so that material is selectively removed, forming a three-dimensional pattern on the surface of the chip. Through additional lithographic and etching steps, subsequent layers of various patterned materials are built up on the wafer to form the multiple layers of circuit patterns on the chip.
Once wafer processing is complete, each chip (or die) on the wafer is tested for electrical performance, cut apart with wafer saws, and put into an individual protective package. Once packaged, chips are tested again to make sure they function properly before being shipped to distributors or placed in electronic products.
It sometimes occurs that a single pattern is larger than the exposure field size of the photolithographic stepper. When this occurs, the pattern is subdivided into sub-patterns, which are exposed separately and “stitched” together to reproduce the original pattern. If non-uniformities exist across the exposure field of the photolithographic stepper, these non-uniformities may show up as discontinuities along the stitch boundaries.
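The subdivision described above can be sketched as a simple geometric calculation. The following is an illustrative sketch only, not taken from the source: the pattern and field dimensions, the ceiling-division grid layout, and the function names are all assumptions for the example.

```python
# Hypothetical sketch: splitting a pattern wider than the stepper's
# exposure field into a grid of sub-fields, and locating the stitch
# boundaries where discontinuities could appear. All dimensions are
# illustrative assumptions, not values from the source.
import math

def subdivide(pattern_w, pattern_h, field_w, field_h):
    """Return the (columns, rows) grid of sub-fields needed to cover
    a pattern of size pattern_w x pattern_h with exposure fields of
    size field_w x field_h (ceiling division in each axis)."""
    cols = math.ceil(pattern_w / field_w)
    rows = math.ceil(pattern_h / field_h)
    return cols, rows

def stitch_positions(pattern_w, field_w):
    """Return the x-coordinates of the vertical stitch boundaries
    between adjacent sub-fields (interior boundaries only)."""
    cols = math.ceil(pattern_w / field_w)
    return [c * field_w for c in range(1, cols)]

# Example: a 40 mm x 15 mm pattern with a 26 mm x 33 mm field
# requires a 2 x 1 grid, with a single stitch at x = 26 mm.
print(subdivide(40, 15, 26, 33))   # (2, 1)
print(stitch_positions(40, 26))    # [26]
```

Any field-position-dependent error in the stepper (e.g. distortion near the field edges) lands at these stitch coordinates, which is why such non-uniformities appear as discontinuities along the stitch.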