High-speed ADCs are often implemented by operating several sub-ADCs in parallel in an interleaved mode, effectively multiplying the conversion speed of an individual sub-ADC. For example, interleaving four sub-ADCs increases the effective conversion speed by a factor of four.
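The round-robin operation described above can be sketched as follows (a minimal illustration with hypothetical values, not any particular ADC architecture): each sub-ADC converts every fourth sample, and re-interleaving the sub-ADC outputs reconstructs the full-rate stream.

```python
import numpy as np

# Minimal sketch (illustrative values): a full-rate sample stream
# handled by n_sub interleaved sub-ADCs in round-robin order.
n_sub = 4                        # number of sub-ADCs (assumption)
x = np.arange(16, dtype=float)   # full-rate sample stream

# Sub-ADC k converts samples k, k + n_sub, k + 2*n_sub, ...
sub_streams = [x[k::n_sub] for k in range(n_sub)]

# Re-interleaving the sub-ADC outputs reconstructs the full-rate stream.
y = np.empty_like(x)
for k, s in enumerate(sub_streams):
    y[k::n_sub] = s

print(np.array_equal(x, y))                    # True: full-rate stream recovered
print(len(sub_streams[0]) * n_sub == len(x))   # True: each sub-ADC runs at 1/4 rate
```

Each sub-ADC thus needs only one quarter of the aggregate conversion rate, which is the source of the speed multiplication.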
The phases of the sampling clocks for an interleaved ADC comprising n sub-ADCs should theoretically be exactly 2π/n apart, and the sub-ADCs should be matched in gain and offset. Errors in any of these parameters commonly cause spurs in the output spectrum that degrade the spurious-free dynamic range (SFDR) and the signal-to-noise ratio (SNR). Known implementations of interleaved ADCs may include calibration of the offset and gain of the sub-ADCs. However, in modern applications the effect of phase mismatches may be one of the dominant factors limiting ADC performance. For example, the sampling-time mismatches between sub-ADCs may need to be smaller than 1 ps to avoid a significant degradation of performance.
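The impact of a sampling-time mismatch can be demonstrated with a short simulation (all parameters are assumptions chosen for illustration: a 4-way interleaved ADC at 1 GS/s, a coherently sampled sine input, and a 2 ps skew on one sub-ADC). The skew produces spurs at fs/4 ± fin that collapse the SFDR from the numerical noise floor to well under 100 dB.

```python
import numpy as np

# Illustrative simulation (all parameters are assumptions): a 4-way
# interleaved ADC sampling a sine wave, with one sub-ADC sampling 2 ps late.
fs = 1.0e9                 # aggregate sampling rate
M, N = 4, 4096             # number of sub-ADCs, FFT length
fin = 501 / N * fs         # coherent input frequency (~122.3 MHz)

t = np.arange(N) / fs
skew = np.zeros(N)
skew[1::M] = 2e-12         # sub-ADC 1 samples 2 ps too late

ideal = np.sin(2 * np.pi * fin * t)
skewed = np.sin(2 * np.pi * fin * (t + skew))

def sfdr_db(x):
    # SFDR: carrier magnitude relative to the largest remaining spectral line.
    spec = np.abs(np.fft.rfft(x))
    k = spec.argmax()
    carrier = spec[k]
    spec[k - 3:k + 4] = 0.0  # exclude the carrier bin and its neighbors
    return 20 * np.log10(carrier / spec.max())

print(f"ideal  SFDR: {sfdr_db(ideal):6.1f} dB")   # limited only by numerics
print(f"skewed SFDR: {sfdr_db(skewed):6.1f} dB")  # spurs at fs/4 +/- fin
```

Since the spur amplitude scales roughly linearly with the skew, halving the mismatch buys only about 6 dB of SFDR, which is why sub-picosecond matching is needed in practice.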
The sampling phases of the sub-ADCs are commonly generated by a master clock generator. Due to variations in fabrication processes, the clock circuit paths corresponding to the sub-ADCs may exhibit slightly different delays, which prevents the sampling instants from being exactly 2π/n apart from each other.
Known timing-mismatch calibration techniques typically require a significant amount of time to converge, impose limitations on the input signal bandwidth, or depend on input signal properties. Some solutions use adaptive digital filters with a large number of taps, which results in high power consumption and complexity and limits the achievable sampling frequency. It is desirable to overcome these limitations.