High-speed and high-accuracy analog-to-digital converters (ADCs) are used in many electronic architectures and solutions. For example, high-speed, high-accuracy ADCs are used in digitally modulated radar systems, communication systems, and other environments. Design of such ADCs is challenging due to speed and accuracy requirements, particularly for low-power solutions. Further, with semiconductor processing at 40 nanometer (nm) nodes and below, the difficulty of designing such ADCs increases significantly.
Typical designs that are high-speed (e.g., sample rates of 1 gigahertz (GHz) and above) use time-interleaved stages to obtain composite sample rates that achieve desired speeds and throughput. The use of these time-interleaved stages, however, is susceptible to errors in the sampling process, requires tightly controlled clock skew between the time-interleaved stages, and often requires tuned delays between interleaved paths. In addition to skew errors, gain and offset errors within the time-interleaved stages introduce spurious, unwanted tones. The sampling process errors and gain/offset errors degrade the output spectrum of these prior time-interleaved ADC solutions. As such, the digital conversion provided by these prior time-interleaved ADC solutions can suffer from accuracy degradation that degrades overall system performance.
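The spectral degradation described above can be illustrated with a minimal numerical sketch. The model below (a hypothetical 4-way interleaved converter with assumed gain and offset values, not taken from any particular design) digitizes a coherent sine wave where sample n is handled by channel n mod M, each channel having its own gain and offset error. In the resulting FFT, gain mismatch produces image spurs at k·fs/M ± fin and offset mismatch produces fixed tones at k·fs/M, which is the mechanism by which these errors corrupt the output spectrum.

```python
import numpy as np

def interleaved_adc(x, gains, offsets):
    """Model an M-way time-interleaved ADC: sample n is digitized by
    channel n % M, each channel with its own gain and offset error."""
    M = len(gains)
    idx = np.arange(len(x)) % M
    return np.asarray(gains)[idx] * x + np.asarray(offsets)[idx]

N, M = 1024, 4
fin_bin = 101                    # coherent input frequency (FFT bin)
n = np.arange(N)
x = np.sin(2 * np.pi * fin_bin * n / N)

# Hypothetical per-channel mismatches: a few percent of gain error
# and small offsets, chosen only for illustration.
gains = [1.00, 1.02, 0.99, 1.01]
offsets = [0.000, 0.005, -0.003, 0.002]

matched = np.abs(np.fft.rfft(interleaved_adc(x, [1.0] * M, [0.0] * M))) / N
mismatched = np.abs(np.fft.rfft(interleaved_adc(x, gains, offsets))) / N

# Gain mismatch -> image spurs at bins fin_bin + k*N/M;
# offset mismatch -> tones at bins k*N/M.
print(f"signal tone   (bin {fin_bin}): {mismatched[fin_bin]:.4f}")
print(f"gain image    (bin {fin_bin + N // M}): {mismatched[fin_bin + N // M]:.6f}")
print(f"offset tone   (bin {N // M}): {mismatched[N // M]:.6f}")
```

With ideal (matched) channels the spectrum contains only the input tone; with the mismatches enabled, distinct spurs appear at the predicted bins, showing why uncalibrated interleaving degrades spurious-free dynamic range.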