An analog-to-digital converter (ADC) samples an analog signal at a periodic interval and converts each sample to a digital representation. The accuracy of the conversion is limited by the fundamental linearity of the converter, which is predictable and repeatable, and by noise, which fluctuates randomly in time. The noise causes a statistical variation of the ADC output about its mean value for a given input. This statistical variation is typically assumed to have a Gaussian probability distribution with a given standard deviation. The probability distribution determines the signal-to-noise ratio (SNR) of the conversion.
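This noise model can be illustrated with a short sketch, not taken from the source: each ADC sample is treated as the true input value plus zero-mean Gaussian noise, and the SNR is estimated for a full-scale sine input. The noise standard deviation `sigma` and the sample count are assumed example figures.

```python
import math
import random

# Illustrative sketch (assumed numbers): model each ADC sample as the true
# input plus zero-mean Gaussian noise, then estimate SNR for a full-scale
# sine-wave input. "sigma" is an example RMS noise value, not a figure
# from the text.
random.seed(0)

full_scale = 1.0          # amplitude of the test sine wave
sigma = 0.001             # assumed RMS noise, same units as the input
n_samples = 100_000

signal_power = full_scale ** 2 / 2.0   # mean-square value of a sine wave
noise_power = sum(random.gauss(0.0, sigma) ** 2
                  for _ in range(n_samples)) / n_samples

snr_db = 10.0 * math.log10(signal_power / noise_power)
print(f"estimated SNR: {snr_db:.1f} dB")   # close to 10*log10(0.5 / sigma**2)
```

With these example values the estimate lands near 57 dB, matching the analytic ratio of sine-wave power to Gaussian noise power.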
In addition to noise, many ADC architectures are also susceptible to the problem of metastability. For example, a metastability event may occur in an ADC when an internal comparator has insufficient time to resolve an internal voltage comparison to a digital 1 or 0 decision. Such a metastability event generally occurs when the voltage difference applied to the comparator is close to zero. Comparator metastability events can cause internal malfunctions in ADC operation, resulting in very large ADC errors. A large ADC error means that the digital output signal produced by the ADC has a significantly different value from the analog input signal it is supposed to represent.
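The mechanism can be sketched with a simple regenerative-comparator model, an assumption for illustration rather than a description of any particular circuit: the comparator's internal voltage difference grows exponentially from its initial value, and a decision fails when that initial difference is too close to zero to reach a valid logic level in the allotted time.

```python
import math

# Illustrative sketch (assumed model, not from the source): a regenerative
# comparator amplifies its initial input difference exponentially,
# v(t) = dv * exp(t / tau). If |dv| is too small, v(t) fails to reach a
# valid logic level within the allotted time: a metastability event.
tau = 10e-12           # assumed regeneration time constant: 10 ps
t_avail = 100e-12      # assumed time available to resolve: 100 ps
v_logic = 0.5          # assumed voltage needed for a valid 1/0 decision

def resolves(dv):
    """True if an initial difference dv reaches a valid logic level in time."""
    return abs(dv) * math.exp(t_avail / tau) >= v_logic

print(resolves(1e-3))  # a 1 mV difference resolves easily
print(resolves(1e-9))  # a 1 nV difference (near zero) does not: metastable
```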
Although metastability events may be rare, and so may not significantly impact the overall average SNR of the converter, they can make the statistical distribution of the converter error non-Gaussian. In particular, they can cause the probability of a very large converter error to be much greater than one would expect for a given SNR when a Gaussian error distribution is assumed. This can be a serious problem in many applications, particularly in test and measurement equipment.
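A Monte Carlo sketch with assumed numbers makes this tail effect concrete: a purely Gaussian error model predicts an astronomically small probability of exceeding, say, eight standard deviations, while the same model plus rare metastability events exceeds that threshold roughly as often as a metastability event occurs. The event probability and error magnitude below are hypothetical example values.

```python
import math
import random

# Illustrative sketch (assumed numbers, not from the source): compare the
# probability of a "very large" conversion error under a purely Gaussian
# error model against the same model plus rare metastability events that
# each produce a very large error.
random.seed(1)

sigma = 1.0                # Gaussian noise std dev (arbitrary units)
p_meta = 1e-4              # assumed probability of a metastability event
big_error = 100.0 * sigma  # assumed size of a metastability-induced error
threshold = 8.0 * sigma    # "very large error" threshold
n = 1_000_000

def gaussian_tail(x):
    # P(|e| > x*sigma) for a zero-mean Gaussian error
    return math.erfc(x / math.sqrt(2.0))

exceed = 0
for _ in range(n):
    e = random.gauss(0.0, sigma)
    if random.random() < p_meta:
        e += big_error         # rare metastability event
    if abs(e) > threshold:
        exceed += 1

print(f"Gaussian prediction P(|e| > 8 sigma): {gaussian_tail(8.0):.1e}")
print(f"simulated with metastability:         {exceed / n:.1e}")
```

The simulated tail probability tracks `p_meta`, many orders of magnitude above the Gaussian prediction, even though the average SNR is barely changed.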
The probability of comparator metastability generally decreases exponentially with the ratio of the conversion time period to the time constant of the comparator. The time constant of the comparator can often be decreased at the cost of increased power consumption, but there is a limit to how small it can be made for a given process technology. In addition, the conversion time period can be increased for a given sample rate by interleaving multiple ADCs in time, with each ADC operating on one out of every several input samples. This can cause problems, however, because mismatches between the interleaved ADCs create periodic errors in the ADC output, referred to as interleave spurs.
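The exponential dependence and the benefit of interleaving can be sketched numerically. Modeling the metastability probability as exp(-T/tau), where T is the comparison time and tau the comparator time constant, an M-way interleave at a fixed aggregate sample rate gives each ADC roughly M times longer per conversion. All values below are assumed example figures, not data from the text.

```python
import math

# Illustrative sketch (assumed numbers): metastability probability modeled
# as exp(-T / tau). Interleaving M ADCs at a fixed aggregate sample rate
# gives each one roughly M times longer per conversion, shrinking this
# probability exponentially.
tau = 10e-12            # assumed comparator time constant: 10 ps
t_single = 200e-12      # assumed comparison time for a single ADC: 200 ps
m = 4                   # number of interleaved ADCs

def p_metastable(t, tau):
    """Probability of failing to resolve within time t."""
    return math.exp(-t / tau)

p1 = p_metastable(t_single, tau)      # single ADC
pm = p_metastable(m * t_single, tau)  # each ADC in an m-way interleave
print(f"single ADC:        P = {p1:.1e}")
print(f"{m}-way interleave: P = {pm:.1e}")
```

Quadrupling the available comparison time does not merely quarter the probability; it raises the exponent by a factor of four, here taking it from roughly 1e-9 to roughly 1e-35, which is why interleaving is attractive despite the interleave-spur penalty.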