An Analog-to-Digital Converter (ADC) converts a continuous analog input signal into a set of discrete levels by sampling it, driven by a sample clock, at a (typically constant) sampling rate and quantizing each sample. Many mixed-signal systems use ADCs to convert an analog input signal into digital form for subsequent digital processing.
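The sample-and-quantize operation described above can be sketched in a few lines. This is an illustrative model only, not an implementation from the source: the function name, the bipolar full-scale range, and the mid-tread quantizer choice are all assumptions made for the example.

```python
import math

def sample_and_quantize(signal, fs, n_samples, n_bits, full_scale=1.0):
    """Sample a continuous-time signal at rate fs (driven by an ideal
    sample clock) and quantize each sample to 2**n_bits discrete levels."""
    levels = 2 ** n_bits
    step = 2 * full_scale / levels  # LSB size for a bipolar input range (assumed)
    out = []
    for n in range(n_samples):
        t = n / fs                  # constant sampling rate: one sample per clock tick
        x = signal(t)
        # mid-tread uniform quantizer, clipped to the full-scale code range
        code = round(x / step)
        code = max(-levels // 2, min(levels // 2 - 1, code))
        out.append(code * step)
    return out

# Example: a 1 kHz sine sampled at 48 kHz by an ideal 8-bit quantizer
samples = sample_and_quantize(lambda t: 0.9 * math.sin(2 * math.pi * 1000 * t),
                              fs=48_000, n_samples=48, n_bits=8)
```

Each output value differs from the true input by at most half an LSB, which is the quantization error a practical ADC adds on top of its spurious products and noise.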
Practical ADCs introduce spurious products and noise into the measured data that are deterministically related to the ratio of the input signal frequency to the sample clock frequency. Spurious-free dynamic range (SFDR), the ratio of the strength of the fundamental signal to that of the strongest spurious signal in the output, is a key parameter used to specify ADCs and digital-to-analog converters (DACs). An ADC sampling at a given rate has regions of the input frequency spectrum that are optimal for in-band SFDR, and other regions for which many of the spurs fall within the band of interest.
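The SFDR definition above can be computed directly from a magnitude spectrum: find the fundamental (strongest) bin, find the strongest remaining bin, and take the ratio in dB. This is a minimal sketch, assuming a real-valued input and a coherently sampled tone; the function name and the direct DFT (rather than an FFT library) are choices made for the example.

```python
import math, cmath

def sfdr_db(samples):
    """SFDR: ratio, in dB, of the fundamental to the strongest spurious
    component in the magnitude spectrum (DC bin excluded)."""
    n = len(samples)
    # magnitude of the first half of the DFT (sufficient for real input)
    mags = []
    for k in range(n // 2):
        acc = sum(samples[m] * cmath.exp(-2j * math.pi * k * m / n)
                  for m in range(n))
        mags.append(abs(acc))
    mags[0] = 0.0                                   # ignore the DC term
    fund = max(range(len(mags)), key=lambda k: mags[k])
    spur = max(m for k, m in enumerate(mags) if k not in (0, fund))
    return 20 * math.log10(mags[fund] / spur)

# fundamental at bin 5 plus a spur 40 dB (100x in amplitude) below it at bin 13
n = 64
sig = [math.sin(2 * math.pi * 5 * m / n) + 0.01 * math.sin(2 * math.pi * 13 * m / n)
       for m in range(n)]
```

Here `sfdr_db(sig)` recovers roughly 40 dB, the amplitude ratio deliberately built into the test signal.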
Moreover, an ADC typically includes a comparator whose settling time, the time needed to compare the input analog signals, is not constant but varies greatly with the difference between the input signal and the feedback signal. The settling time also depends strongly on the comparator's gain-bandwidth product; for instance, the settling time is short when the difference between the input signal and the feedback signal of an ADC is large.
Some prior approaches to addressing these problems with ADCs include suppressing the noise and deterministic spurious signals with added signal-processing circuits, or choosing a specific frequency plan and sample rate that places the operating band within a favorable region of the ADC's spurious performance, which requires an additional conversion stage.
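The frequency-planning approach above rests on aliasing arithmetic: after sampling at rate fs, any tone (including a harmonic spur) folds back into the first Nyquist zone, 0 to fs/2. A short sketch, with function names and the example frequencies chosen purely for illustration, shows where harmonic spurs land and why changing the sample rate moves them:

```python
def alias_freq(f, fs):
    """Frequency at which a tone at f appears after sampling at fs
    (folded into the first Nyquist zone, 0..fs/2)."""
    f = f % fs
    return f if f <= fs / 2 else fs - f

def harmonic_spurs(f_in, fs, n_harmonics=5):
    """Aliased locations of the 2nd..nth harmonic spurs of an input tone."""
    return [alias_freq(h * f_in, fs) for h in range(2, n_harmonics + 1)]

# With fs = 100 MHz, a 30 MHz input folds its 2nd harmonic (60 MHz) to 40 MHz
# and its 3rd harmonic (90 MHz) to 10 MHz -- both land in the first Nyquist zone.
spurs = harmonic_spurs(30e6, 100e6, n_harmonics=3)
```

A fixed frequency plan picks fs so that these folded spurs fall outside the band of interest; the dynamic tuning motivated below instead adjusts the sample clock at run time as the input signal changes.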
Accordingly, there is a need for a control circuit that dynamically tunes an ADC sample clock to optimize ADC performance for the signal received in real time.