Field of the Invention
The disclosed invention relates generally to failure analysis, and more particularly, but not by way of limitation, relating to a massive multi-dimensional failure analysis of a circuit.
Description of the Related Art
An important part of manufacturing and troubleshooting a circuit is locating and correcting a malfunctioning circuit or area of a circuit. Such failure analysis is important for reliability of a circuit, but can be very time consuming and expensive. Fault analysis of highly-integrated semiconductor circuits has become an important tool in the optimization of product quality.
Advance warning of a system or component failure is desirable, as it supports competitiveness through lower cost, higher reliability, and minimized downtime. Such failures may present safety or maintenance concerns that could result in loss of market share and increased costs in the future.
As memory array architectures increase requirements for density and speed, accurately estimating the cell failure rate of a design becomes critical. For example, because only a finite number of redundant rows and/or columns is available to replace those containing defective cells, a number of failed cells above this level of redundancy will yield a defective device. The number of defective devices, and hence device yield, is then directly related to the cell failure rate. Additionally, the larger arrays have increasingly stringent failure rate control requirements.
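The relationship between the cell failure rate, the available redundancy, and device yield can be illustrated with a simple binomial model. This is a sketch for illustration only, not taken from the disclosure: it assumes independent cell failures and repair at cell granularity, and all names are hypothetical.

```python
from math import comb

def device_yield(n_cells, p_cell_fail, n_redundant):
    """Probability that a device is repairable, i.e., that at most
    n_redundant cells fail (binomial model, independent cell failures)."""
    return sum(
        comb(n_cells, k) * p_cell_fail**k * (1 - p_cell_fail)**(n_cells - k)
        for k in range(n_redundant + 1)
    )

# With no redundancy, yield is simply (1 - p)^N; each added spare
# row/column raises the tolerable number of failed cells by one.
y0 = device_yield(1000, 1e-3, 0)
y2 = device_yield(1000, 1e-3, 2)
```

Even this toy model shows why the cell failure rate must be known accurately: with a fixed redundancy budget, yield falls off sharply once the expected number of failed cells approaches the number of spares.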
The Monte-Carlo analysis of the defective cells requires a very large number of iterations, due to the random sampling of the entire probability space of the independent variables that are treated in the analysis. For example, the failure rate Pf can be estimated by a brute-force Monte Carlo analysis. When a brute-force Monte Carlo analysis is applied to estimate a failure rate Pf that is extremely small (e.g., 10^-8 to 10^-6), most random samples drawn from the PDF f(x) do not fall into the failure region Ω. Hence, a large number of samples (e.g., 10^7 to 10^9) are needed to accurately estimate the failure rate Pf.
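A brute-force Monte Carlo estimator of Pf can be sketched as follows. This is a minimal illustration under assumed names: the one-dimensional Gaussian variation and the 3-sigma failure threshold are hypothetical stand-ins for f(x) and the failure region Ω, not details from the disclosure.

```python
import random

def brute_force_failure_rate(sample_fn, is_failure, n_samples):
    """Estimate Pf = Pr[x in failure region] by drawing n_samples
    random samples from the PDF and counting failures."""
    failures = sum(1 for _ in range(n_samples) if is_failure(sample_fn()))
    return failures / n_samples

random.seed(0)  # reproducibility of this illustration
# Toy model: a standard-normal process variation causes failure beyond
# 3 sigma, so the true Pf is roughly 1.35e-3.
pf = brute_force_failure_rate(
    sample_fn=lambda: random.gauss(0.0, 1.0),
    is_failure=lambda x: x > 3.0,
    n_samples=1_000_000,
)
```

Even for this moderate Pf of about 10^-3, a million samples are needed for a stable estimate; for the rates of 10^-8 to 10^-6 discussed above, the sample count grows to the 10^7 to 10^9 range, which is what makes the brute-force approach expensive.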
As the cell failure rate decreases, the number of samples and iterations required for accurate analysis becomes increasingly large, because of the relatively sparse distribution of samples in the distribution that correspond to failed cells. The effect of circuit changes on cell readability and writeability, as well as on minimum read and write cycle times and margins, is difficult to estimate at very low failure rate levels, so such low failure rates cause further complications for adjusting designs to achieve the best result.
Techniques other than Monte-Carlo analysis have been implemented for estimating cell failure rates, each with related drawbacks. Sensitivity analysis is a well-known technique in which the gradients of the various independent variables are used to determine the bounds of the non-failure confidence region. However, accurate estimates of the failure rate are not typically produced by sensitivity analysis, as sensitivity analysis by its very nature cannot determine the exact overlapping impact of all independent variables on the cell failure rate at once. Another technique that can accurately estimate the failure rate is the grid analysis approach, in which the grid size can be made arbitrarily small. However, the number of simulations increases exponentially with the number of independent variables, and typically a large amount of custom-coded program control (scripting) must be employed to direct the analysis.
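The exponential growth of the grid analysis approach with dimensionality can be made concrete with a short sketch (names are illustrative, not from the disclosure):

```python
def grid_simulation_count(points_per_axis, n_dims):
    """Simulations required to cover a regular grid: one per grid point,
    i.e., points_per_axis raised to the number of independent variables."""
    return points_per_axis ** n_dims

# At 100 grid points per variable:
# 2 variables  -> 100**2 = 10,000 simulations
# 6 variables  -> 100**6 = 10**12 simulations
n2 = grid_simulation_count(100, 2)
n6 = grid_simulation_count(100, 6)
```

This is why the grid approach, although arbitrarily accurate in principle, becomes impractical as the number of independent process variables grows.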
Accurately estimating the rare failure rates for nanoscale circuit blocks (e.g., SRAM (Static Random Access Memory), DFF, etc.) is an even more challenging task, especially when the variation space is high-dimensional. The random variations in process parameters have emerged as a major design challenge in circuit design in the nanometer scale. For example, in an SRAM cell, a mismatch in the strength between the neighboring transistors, caused by intra-die variations, can result in the failure of the cell.
Single-dimensional failure analysis techniques are limited and time consuming.
Therefore, the state of the art solution for statistical circuit analysis and yield analysis does not scale well for a higher number of dimensions and variation sources.
Therefore, it is desirable to provide a failure analysis of a circuit that is cost effective and efficient.
It is also desirable to have an effective analysis of yield and reliability for devices and circuits using advanced statistical analysis techniques with many parametric variables at higher dimensions.