Conventional radio frequency (RF) time-interleaved (TI) analog-to-digital converter (ADC) circuits may rely on several background (BG) calibration algorithms to maximize their performance. However, these BG calibration algorithms may require the input signal to be present for a sufficiently long time to reach the required performance. Furthermore, if the BG calibrations, such as gain and time-skew calibrations, are kept running when no signal is present, the calibrations diverge and consequently require re-convergence when the signal returns, which may not be tolerable in some cases.
Some irregular signals in practical applications, such as burst mode signals used in cable applications and wireless communications, or unpredictable signals in radar applications for example, can dramatically limit the accuracy of the BG calibration blocks if no intelligent control is used while receiving the data.
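One way to illustrate the divergence problem described above is to gate calibration updates on a detected input power, so that the loop freezes while a burst is absent. The following is a minimal sketch of that idea, assuming a simple smoothed power detector and a toy LMS-style gain-error loop; the threshold, leakage, and loop-gain values are illustrative assumptions, not values from the source.

```python
# Hypothetical sketch (assumed, not the source's algorithm): freeze a
# background-calibration loop when no signal is detected, so the estimate
# holds its value instead of diverging between bursts.

POWER_THRESHOLD = 1e-3   # assumed squelch level (illustrative)
LEAK = 0.99              # power-detector smoothing factor (assumed)
MU = 0.01                # calibration loop gain (assumed)

def run_calibration(samples, gain_est=1.0):
    """Update a toy gain-error estimate only while signal is detected."""
    power = 0.0
    for x in samples:
        # first-order smoothed power detector
        power = LEAK * power + (1.0 - LEAK) * x * x
        if power < POWER_THRESHOLD:
            continue  # freeze: hold gain_est while the signal is absent
        # toy LMS-style update driving the estimate toward unity gain
        err = 1.0 - gain_est
        gain_est += MU * err
    return gain_est
```

With this gating, an all-zero (signal-absent) record leaves the estimate untouched, while a live burst pulls it toward convergence, which is the behavior the intelligent control discussed above is meant to provide.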
In conventional applications using discrete ADC devices, the control of the calibrations may be hard-wired in the device, making it difficult for users to configure or tweak it as a function of the input signal characteristics and the application. Furthermore, the use of JESD interfaces with discrete ADCs makes it difficult to optimize or include reactive control by an application, which leads to many sub-optimal implementations.
Further, during start-up, it may be necessary to run a foreground (FG) calibration cycle to maximize performance of the offset calibration loops. These loops may take a long time to settle to sufficient accuracy (e.g., >2 s) when static calibration integration coefficients are used, which may be too slow for some applications.
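The settling-time limitation of a static integration coefficient can be seen in a small sketch. The following illustrates a "gear-shifted" loop coefficient, an assumed alternative not taken from the source, in which a large coefficient gives fast initial settling and progressively smaller coefficients then reduce residual noise; all coefficient values and dwell counts are illustrative.

```python
# Hypothetical sketch (assumed, not the source's algorithm): an offset
# calibration loop whose integration coefficient is stepped down over time.
# A single static coefficient must compromise between settling speed and
# residual noise; gear-shifting avoids that trade-off.

GEARS = [0.25, 0.0625, 0.0078125]  # illustrative coefficients (powers of 2)
SAMPLES_PER_GEAR = 200             # assumed dwell time per gear

def calibrate_offset(samples):
    """Estimate a DC offset with progressively smaller loop coefficients."""
    offset = 0.0
    for i, x in enumerate(samples):
        # pick the coefficient for the current gear, holding the last gear
        mu = GEARS[min(i // SAMPLES_PER_GEAR, len(GEARS) - 1)]
        offset += mu * (x - offset)  # first-order integrator update
    return offset
```

Power-of-two coefficients are chosen here because such loop gains reduce to bit shifts in hardware; the specific gear schedule would in practice be tuned to the accuracy target.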
Accordingly, there is a need for receivers having improved calibration.