This invention relates to the design of integrated circuits, and more specifically, to methods and systems for performing hierarchical variation analysis of integrated circuits.
The age of information and electronic commerce has been made possible by the development of electronic circuits and their miniaturization through integrated circuit technology. Integrated circuits are sometimes referred to as “chips” or “ICs.” To meet the challenges of building more complex and higher performance integrated circuits, various specialized software tools are used. These tools exist in one or more areas commonly referred to as computer aided design (CAD), computer aided engineering (CAE), electronic design automation (EDA), or technology computer aided design (TCAD). This document focuses on the latter two, EDA and TCAD. Typically, as advancing technology requires more sophisticated technology modeling within EDA flows, TCAD tools and methods are routinely adopted and recast for EDA application.
There is a constant need to improve these tools for each technology generation in order to address the requirements for higher integration, greater functional capability and complexity, smaller chip area, and better performance of integrated circuits.
It has been particularly important to improve tools in the areas of design for manufacturability (DFM) and variation-aware modeling (VAM), due to the many inherent variations in play for nanoscale process technologies.
Despite vast increases in circuit and process technology complexity, market forces demand efficient, robust design and well-controlled, rapid time-to-market. Therefore, inherent manufacturability, yield, and reliability must still be maintained or improved, and specialized DFM tools are now commonly used in integrated circuit design flows for this purpose.
These DFM tools are typically and primarily driven by the photolithographic patterning and pattern-transfer processes which enable miniaturization for each successive process generation, but do so with limited patterning fidelity. As such, these tools have primarily been focused on optimizing the geometric shape fidelity of the features defining the physical circuit in the lateral planes of the manufactured chip through the addition of artificial resolution enhancement technologies (RETs), such as optical proximity correction (OPC) to improve shape fidelity and subresolution assist features (SRAFs) to widen the manufacturing process window over which acceptable fidelity is achieved. Other artificial features are added to compensate for manufacturing limitations, including dummy fill features to improve planarity; highly planar layers are required for nanoscale lithography, and great effort is expended adding dummy area fill features for this purpose. However, the addition of these RETs and dummy fill features is not a manufacturing panacea. Tradeoffs are typically made which can impact performance or yield, and their optimization is non-intuitive, requiring highly advanced tools. For example, aggressive OPC can narrow the manufacturing process window over which acceptable performance yield is achieved, so it must be applied judiciously.
Likewise, dummy fill typically has the unavoidable consequence of greater parasitic coupling of signals to the dummy feature, thereby impacting logic and clock signals. Careful tradeoffs must therefore be made between manufacturability and performance/yield.
These inherent nanoscale technology effects as well as many others not discussed mandate extensive analysis of variations, their consequences, and means to mitigate them so as to minimize their impact on the functional performance of the IC over an acceptable margin.
Ultimately, the function of ICs is graded based upon electrical function metrics. When DFM tools are primarily driven by electrical function metrics rather than geometrical shape, they are often referred to as Electrical Design For Manufacturing (eDFM) tools, so circuit simulation is a key component of eDFM. The so-called “parasitic” extractor is likewise a key component, as it is used to generate, from the design layout or from the simulated “silicon image” of it, the physical circuit in terms of elementary circuit elements: resistance (R), inductance (L), and capacitance (C); the three elements collectively, RLC or impedance.
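To make the extractor's role concrete, the following is a minimal illustrative sketch of the kind of per-segment calculation a parasitic extractor performs, deriving elementary R and C values from layout geometry. The function names, dimensions, and material constants are hypothetical examples for illustration, not values from any particular extractor or process.

```python
# Illustrative per-segment parasitic estimates.  All parameter values
# below are hypothetical; a real extractor uses calibrated process data.

def wire_resistance(length_um, width_um, sheet_res_ohm_sq):
    """Resistance of a wire segment: R = Rs * (L / W)."""
    return sheet_res_ohm_sq * (length_um / width_um)

def plate_capacitance(length_um, width_um, dielectric_nm, k_rel=3.0):
    """Parallel-plate estimate C = eps0 * k * A / d, returned in fF."""
    EPS0 = 8.854e-18          # vacuum permittivity in F/um
    area_um2 = length_um * width_um
    d_um = dielectric_nm * 1e-3
    return EPS0 * k_rel * area_um2 / d_um * 1e15  # convert F to fF

# A 100 um x 0.1 um wire over a 100 nm dielectric (illustrative values)
r = wire_resistance(100.0, 0.1, sheet_res_ohm_sq=0.5)   # 500 ohms
c = plate_capacitance(100.0, 0.1, dielectric_nm=100.0)  # a few fF
```

In practice, an extractor evaluates formulas of this kind (or table lookups and field-solver results) for every segment of every net, which is why the resulting network grows so large.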
However, eDFM tools are typically limited by circuit simulator capacity and speed due to the large number of non-ideal physical circuit features which must be resolved and considered in such a methodology for accurate results.
As IC interconnect, via, and device critical dimensions are reduced and/or system frequency is increased many additional so-called “parasitic” RLC effects must be considered. These parasitic effects can cause unwanted cross-coupling of signals, IR drop impacting signal, supply, or ground voltage, and noise in clock, signal, and power distribution networks. These effects can be significant and should properly be accounted for in simulation of the IC. If not, there is increased risk that the IC will have functional failure or performance limitations following fabrication and incorporation into an end product.
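As a concrete example of one such parasitic effect, the sketch below gives a first-order estimate of IR drop along a power rail modeled as a chain of series resistive segments, each tapping an equal load current. The model, function name, and numeric values are hypothetical illustrations, not taken from any particular tool or design.

```python
def far_end_ir_drop(n_segments, seg_res_ohm, tap_current_a):
    """First-order IR drop at the far end of a resistive power rail.

    The rail is modeled as n series segments; one equal load current
    taps off at the end of each segment, so each segment carries the
    summed current of all downstream taps.
    """
    drop = 0.0
    current = n_segments * tap_current_a   # current entering segment 1
    for _ in range(n_segments):
        drop += current * seg_res_ohm
        current -= tap_current_a           # one tap peels off per segment
    return drop

# Ten 0.1-ohm segments, each tapping 1 mA: 5.5 mV drop at the far end.
drop = far_end_ir_drop(10, 0.1, 1e-3)
```

Even this toy model shows why distributed parasitics matter: the drop accumulates nonuniformly along the rail, so lumping the rail into a single resistor misstates the voltage seen by far-end loads.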
The number of parasitic effects has been increasing with each process generation, and with increases in circuit size, complexity, and function, simulating the impact of these parasitics can be an enormous challenge requiring very large computing resources and time.
Additionally, myriad variations impact integrated circuit performance and reliability and should be accounted for in any robust integrated circuit design flow. For nanoscale IC designs in general, variations occur in many forms and their impact on the IC function may be very non-intuitive. These variations occur even for well-controlled fabrication processes, simply due to fundamental physical limits dictating the resolution, planarity, or homogeneity of the physical circuit elements. Lithographically patterned features, even with the best RET techniques, can be far from ideal, resulting in geometrical feature distortions and variations in the distributed resistance, inductance, and capacitance of an interconnect or power network. Inter-layer dielectric and metal layer thickness variations due to imperfect planarization processes also occur, similarly impacting the interconnect and power networks. Additionally, intrinsic variations due to fundamental material properties include statistical material inhomogeneities of the various layers comprising the integrated circuit and may impact interconnect as well as active devices, for example through statistical dopant fluctuations. Accounting for these variations is now required for robust design. Variations may also be exhibited in an intra-die or inter-die manner, meaning that they may vary within a single die or differ from die to die across the wafer. An example of the former is pattern-dependent proximity effects, such as those from the photolithography process, wherein constructive and destructive interference occurring during partially-coherent imaging causes critical feature shape variations. Examples of the latter include deposition or etch chamber effects impacting the across-wafer uniformity of a given process. These effects may be time-varying as well.
These variations increasingly must be accounted for during a well-designed IC design flow to reduce functional sensitivity to inherent variations in the fabrication of the IC.
Decreasing feature sizes, increased complexity, and greater function also mean higher current densities within the clock, signal, and power distribution networks of ICs. Consequently, electro-thermal effects are similarly important. These thermal factors cause an increase in metal resistance, impacting the performance of an IC, and also cause a reduction in reliability due to thermal and electromigration failure. These and other variations are increasingly important and may impact nanoscale IC design reliability and performance when the eventual product is used in the field. For example, the duty cycle and operating conditions of a design may severely impact the thermal distribution across the chip, further stressing marginal electromigration-sensitive features and thereby impacting reliability, or raising local resistance and thereby reducing performance.
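The electro-thermal relations referred to above can be illustrated with two textbook forms: a linear temperature coefficient of resistance for metal, and Black's equation for electromigration lifetime. The parameter values in this sketch are hypothetical illustrations only, not calibrated process data.

```python
import math

def metal_resistance(r0_ohm, temp_c, ref_temp_c=25.0, tcr_per_c=0.004):
    """Linear model R(T) = R0 * (1 + alpha * (T - T0)).

    alpha of ~0.004 per degree C is a typical order of magnitude for
    copper interconnect (illustrative, not a process value).
    """
    return r0_ohm * (1.0 + tcr_per_c * (temp_c - ref_temp_c))

def black_mttf(a_const, j_current_density, ea_ev, temp_k, n_exp=2.0):
    """Black's equation: MTTF = A * J**(-n) * exp(Ea / (k * T))."""
    K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV per kelvin
    return a_const * j_current_density ** (-n_exp) * math.exp(
        ea_ev / (K_BOLTZMANN_EV * temp_k))

# Resistance rises ~40% going from 25 C to 125 C with this coefficient;
# electromigration lifetime falls as temperature or current density rises.
r_hot = metal_resistance(100.0, 125.0)
```

Both relations pull in the same direction described in the text: local heating raises resistance (hurting performance) while simultaneously shortening electromigration lifetime (hurting reliability).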
All of these variations have a far greater impact than ever before, especially for low-voltage nanoscale designs or highly-integrated mixed-signal ICs. As these circuits get more complex, there are not only far more variations in play, but the impact of these variations can be very difficult to ascertain due to the design size as well as the size, and even uncertainty, of the domain in which variations are in play. Simulating these variations is increasingly critical; however, the number of variations and the range over which one must simulate pose a significant problem, due not only to the number of variations but also to the size of the circuits they impact.
In circuit simulation, “corner-modeling” is a technique that has traditionally been used to verify the performance extents of an integrated circuit design in terms of both device parameters impacted by manufacturing and chip operating specifications. The idea has been to reduce the simulation domain and make tractable the circuit simulations run to qualify a circuit's inherent functional performance in terms of the variations of key parameter or specification values. Now, however, with nanoscale integrated circuit design and performance increasingly impacted by a large number of these variations, variations which are far less intuitive to comprehend than in prior generations, the problem becomes very difficult. The impact of these variations can be non-intuitive, non-linear, and statistically correlated or uncorrelated, making the problem difficult to simplify. This makes traditional corner modeling very difficult to validate over the large number of corners that would be required, mandating Monte Carlo (MC) methods.
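The contrast between corner enumeration and Monte Carlo sampling can be sketched as follows, using a toy delay metric over hypothetical parameter ranges; neither the metric nor the ranges come from any real process, and both are illustrative assumptions only.

```python
import itertools
import random

def delay(vth, temp, vdd):
    # Toy metric: delay grows with threshold voltage and temperature,
    # shrinks with supply voltage.  Purely illustrative.
    return (vth * (1.0 + 0.002 * temp)) / vdd

# Hypothetical (min, max) ranges for each parameter
params = {"vth": (0.30, 0.50), "temp": (-40.0, 125.0), "vdd": (0.90, 1.10)}

# Corner modeling: evaluate every combination of parameter extremes (2**3 = 8)
corners = [delay(*combo) for combo in itertools.product(*params.values())]

# Monte Carlo: sample parameters uniformly within the same ranges
rng = random.Random(0)
samples = [delay(*(rng.uniform(lo, hi) for lo, hi in params.values()))
           for _ in range(1000)]

worst_corner = max(corners)
worst_sampled = max(samples)
```

With three parameters, eight corners suffice; with dozens of correlated, non-monotonic variation sources, the corner count explodes and the worst case need not sit at a corner at all, which is the motivation for MC methods given above.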
Companies that design nanoscale integrated circuits take on an increasing amount of risk with all of these additional variations and due to their inability to accurately model the impact of these variations. These effects are complex, increasingly intertwined, and can significantly impact the function, yield, and reliability of these ICs and the products they are used in. As a result, designers are typically overly conservative and employ excessive “worst-casing” and “guard-banding,” resulting in designs that operate far beneath the theoretical optimal operating capabilities otherwise achievable. Typically, at least one full process generation is lost to such margining. Even so, the risk is high, and an increasing number of chip designs go to fabrication without sufficient simulation over the full variation domain the designer would otherwise desire, due to the large computational burden of doing so.
Further, while the cost of a major chip project can easily run into many tens or even hundreds of millions of dollars, the risks inherent in nanoscale design and manufacturing are increasing at a significant pace due to inadequate verification coverage over the variational domain.
Although there is great recognition of the problem, especially given that the cost of an IC failure is very large, designers have not had means to mitigate this risk at the scale required due to simulation tool limitations. It has not been possible to perform accurate simulations capturing the impact of these variations, because tools must compromise in order to achieve capacity or speed requirements. The simulation network is typically too large for most simulators, so a number of compromises are made. The most common compromise is hidden reduction of a circuit's complexity by the simulation tool in order to meet capacity or simulation-speed requirements: the distributed parasitic RLC interconnect and power network is reduced to a simpler network comprising fewer elements. However, this also tends to reduce the manufacturing detail out of the simulation. The end result is that the circuit simulation is no longer an accurate deterministic model of the originally designed circuit; it may not map one-to-one to the physical circuit layout, and the variations of concern, particularly for eDFM, are either reduced out of the solution altogether or rendered valueless by the lack of deterministic accuracy.
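A minimal sketch of the hidden reduction described above is the collapse of a chain of series resistors into one lumped element. The end-to-end resistance is preserved, but the internal nodes, exactly where local manufacturing variation or a via connection would attach, are eliminated from the simulated network. The values here are illustrative only.

```python
def reduce_series_chain(segment_resistances):
    """Collapse a series resistor chain into a single lumped resistor.

    The lumped value preserves end-to-end resistance but discards every
    internal node, so per-segment variation (e.g. a locally thinned wire)
    can no longer be represented in the reduced network.
    """
    return sum(segment_resistances)

detailed = [1.0, 1.2, 0.9, 1.1, 1.3]    # five segments, four internal nodes
lumped = reduce_series_chain(detailed)   # a single 5.5-ohm element
```

Real reduction algorithms are far more sophisticated than this, but the tradeoff is the same: fewer elements for simulator capacity, at the cost of the one-to-one mapping to the physical layout that variation analysis requires.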
Therefore, what is needed is a system and technique to enable accurate simulation of circuits impacted by variations across a representative range of manufacturing and operating parameter conditions. Further, what is also needed are tools capable of simulating the impact of nanoscale effects on large-scale circuit blocks or full chips without compromise in accuracy. These tools must also provide practical problem solutions within a reasonable timeframe. The following summary describes the necessity for a true hierarchical variation simulation method using reusable deterministic hierarchical models.