The analog-to-digital converter (ADC) is one of the most frequently used analog/mixed-signal (AMS) components today, and its analog characteristics often make it the primary performance limiter of a system on chip (SoC). AMS components are more susceptible to process, voltage, and temperature (PVT) variation than digital components, so performance can change from die to die. An ADC whose digital output codes do not increase linearly with input voltage is imprecise. Testing ADCs is therefore vital to ensuring quality control and minimizing defective parts in production.
Automated test equipment (ATE) has traditionally been used in semiconductor manufacturing to test an SoC after the chip is fabricated. The goal of any ATE is to find faults or manufacturing defects in the SoC's electrical components. Testing the analog components of every SoC coming out of fabrication is essential to maintaining quality control, but ATE is complex to develop and expensive due to the sophistication of the instrumentation required. It also typically demands long test run times and ample memory.
Built-in self-tests (BISTs) provide a simpler way to test ADCs by moving test circuitry onto the chip itself. This simplicity is not without cost, however, as BISTs greatly add to chip complexity. Two general types of ADC BIST exist today: dynamic testing and static testing. Dynamic testing measures the spectral response of an ADC using sinusoidal input signals to capture signal-to-noise ratio (SNR), spurious-free dynamic range (SFDR), signal-to-noise and distortion (SINAD), and intermodulation distortion (IMD). It relies on the fast Fourier transform (FFT), whose hardware overhead is unacceptably large and consumes valuable chip area. Static testing measures the difference, on a histogram, between an ADC's real output (i.e., the translation of an analog signal to a digital output) and the ideal output for different voltage levels. Once mapped as a histogram, the actual output of the ADC is compared to an ideal staircase-like output in order to calculate the integral nonlinearity (INL) and differential nonlinearity (DNL) between the two. Like dynamic testing, histogram-based static testing uses an inefficiently large amount of resources on an SoC.
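The histogram-based DNL/INL calculation can be sketched in a few lines. The following is a minimal software model, not the on-chip circuit: it assumes a uniform (linear-ramp) input, so an ideal ADC would record the same number of hits for every code, and it excludes the two end codes, which clip and would otherwise dominate the histogram. The function name is illustrative.

```python
def dnl_inl_from_histogram(hist):
    """Compute per-code DNL and INL (in LSB) from a code-hit histogram
    collected with a linear-ramp input. hist[k] = hits for output code k."""
    h = hist[1:-1]                      # drop the two clipping end codes
    avg = sum(h) / len(h)               # ideal hits per code for a ramp
    dnl = [c / avg - 1.0 for c in h]    # deviation of each code's width
    inl, acc = [], 0.0
    for d in dnl:                       # INL is the running sum of DNL
        acc += d
        inl.append(acc)
    return dnl, inl

# A perfectly linear 3-bit ADC: equal hits per code, so DNL = INL = 0.
dnl, inl = dnl_inl_from_histogram([100] * 8)
```

A code that collects more hits than average has DNL > 0 (it is wider than one ideal LSB), and the accumulated widths give the INL at each code transition.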
Moreover, traditional implementations of histogram-based linearity testing require a memory with 2^N locations (2^N − 2, to be precise, since the two end codes are excluded). The width of each location varies with the number of hits per code (HPC) the test generates. A 12-bit ADC with an average HPC of 200 would require 4096 × 8 bits = 32 Kbits (4 KB). To implement such a large memory in an application-specific integrated circuit (ASIC), one has to use high-density logic (HDL) memories. Yet HDL memories have the disadvantage of not being able to read and write the same location in a single clock cycle: incrementing a value in a memory location requires 2 clock cycles (1 read + 1 write). Waiting two clock cycles per increment is not an optimal use of resources.
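The sizing arithmetic above can be written out explicitly. This is a small illustrative helper, not part of any test standard: it assumes each histogram bin is a counter just wide enough to hold the expected hits per code, with no headroom for codes that collect more hits than average.

```python
import math

def histogram_memory_bits(n_bits, avg_hpc):
    """Estimate the memory (in bits) for a conventional histogram BIST:
    one counter per ADC output code, each wide enough to count avg_hpc."""
    codes = 2 ** n_bits                              # histogram bins
    counter_width = math.ceil(math.log2(avg_hpc + 1))  # bits per counter
    return codes * counter_width

# 12-bit ADC, ~200 hits per code: 4096 counters x 8 bits = 32 Kbits (4 KB)
bits = histogram_memory_bits(12, 200)
```

Even this conservative estimate grows exponentially with resolution, which is why the 2^N memory is the dominant cost of a histogram BIST.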