A typical A/D converter includes a circuit (comparator) that compares the magnitudes of two analog signal voltages and outputs the result of the comparison as a digital value. The comparator includes a preamplifier section that amplifies its input and a latch section that judges whether the logic level of the input is 1 or 0. The comparator typically has an offset due to variations in device characteristics, and this offset limits the accuracy of the comparator. Accordingly, calibration is performed in the background to cancel the offset.
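As a minimal sketch of the idea, the following toy model represents a comparator with an input-referred offset and a trim D/A converter that injects a correction; a binary search over the trim code cancels the offset while the inputs are shorted. All names, values, and resolutions here are illustrative assumptions, not taken from any specific device.

```python
class Comparator:
    """Toy comparator: decides sign(vin_p - vin_n + offset - trim).

    `offset_v` models device mismatch; `trim_v` models the correction
    injected by a calibration D/A converter (illustrative names).
    """

    def __init__(self, offset_v):
        self.offset_v = offset_v  # input-referred offset from mismatch
        self.trim_v = 0.0         # correction applied by calibration

    def compare(self, vin_p, vin_n):
        return 1 if (vin_p - vin_n + self.offset_v - self.trim_v) > 0 else 0


def calibrate(comp, dac_lsb=0.1e-3, dac_bits=8):
    """Binary-search the trim DAC while both inputs are shorted (0 V).

    With equal inputs, the comparator output reflects only the residual
    offset, so each decision halves the remaining search interval.
    """
    lo = -(2 ** (dac_bits - 1)) * dac_lsb
    hi = (2 ** (dac_bits - 1)) * dac_lsb
    for _ in range(dac_bits):
        comp.trim_v = (lo + hi) / 2
        if comp.compare(0.0, 0.0):   # output 1 -> residual offset positive
            lo = comp.trim_v         # raise the correction
        else:
            hi = comp.trim_v         # lower the correction
    return comp.trim_v


comp = Comparator(offset_v=3.2e-3)   # assumed 3.2 mV offset
calibrate(comp)
residual = abs(comp.offset_v - comp.trim_v)
```

After the search, the residual offset is bounded by roughly one trim-DAC LSB, which is the basic accuracy target of such a calibration.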
As another typical offset canceling method, dynamic offset calibration is available, in which the offset is canceled using a digital/analog converter (D/A converter) while the comparator is operated under conditions similar to those in the actual operation mode. In dynamic offset calibration, the calibration ends once its completion is detected.
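A dynamic calibration loop of this kind can be sketched as a bang-bang servo: the comparator is clocked normally with its inputs shorted, each decision bumps the trim DAC one LSB against the observed polarity, and repeated output toggling is taken as the completion signal. This is only an illustrative model under assumed noise and step sizes, not any particular product's method.

```python
import random


def dynamic_offset_calibration(offset_v, dac_lsb=0.1e-3, noise_rms=0.05e-3,
                               toggles_needed=8, max_cycles=10000, seed=0):
    """Toy dynamic offset calibration (all parameters are assumptions).

    Each clocked decision drives the trim DAC one LSB toward canceling
    the residual offset. When the output toggles several times in a row,
    the residual is at the noise/LSB level and completion is declared.
    """
    rng = random.Random(seed)
    trim, toggles, prev = 0.0, 0, None
    for _ in range(max_cycles):
        # Comparator decision with inputs shorted: only residual + noise.
        out = 1 if (offset_v - trim + rng.gauss(0, noise_rms)) > 0 else 0
        trim += dac_lsb if out else -dac_lsb   # step against the polarity
        toggles = toggles + 1 if (prev is not None and out != prev) else 0
        prev = out
        if toggles >= toggles_needed:          # completion detected
            break
    return trim


trim = dynamic_offset_calibration(offset_v=3.2e-3)
```

Because the comparator is clocked as in normal operation throughout the loop, the cancellation reflects the offset under the actual operating conditions, which is the point of the dynamic method.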
A similar typical technology is available for a D/A converter that performs background calibration of multiple current source cells. According to this technology, accuracy errors of the individual current source cells are canceled by providing redundant current source cells and sequentially calibrating them.
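The sequential rotation over redundant cells can be sketched as a simple round-robin queue: at any instant one spare cell is offline being trimmed while the others carry the signal, and every step the freshly calibrated cell rejoins service. The cell counts and scheduling below are illustrative assumptions.

```python
from collections import deque


def rotate_calibration(num_total, steps):
    """Round-robin background calibration over redundant cells.

    `num_total` cells exist in total; the head of the queue is the one
    currently offline for calibration. Each step, the calibrated cell
    rejoins the back of the queue and the next cell goes offline.
    Returns the sequence of offline-cell indices, for inspection.
    """
    cells = deque(range(num_total))
    offline_history = []
    for _ in range(steps):
        offline_history.append(cells[0])  # head is being calibrated
        cells.rotate(-1)                  # calibrated cell rejoins the back
    return offline_history


# e.g. 16 cells (15 active plus 1 spare), calibrated in turn
print(rotate_calibration(16, 20)[:5])  # -> [0, 1, 2, 3, 4]
```

Because every cell eventually passes through the offline slot, mismatch in any individual cell is corrected without interrupting conversion.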
Typical dynamic offset calibration is performed under conditions (clock frequency and duty ratio) similar to those in normal operation. Since the power source current generally rises in proportion to the clock frequency, the power source voltage actually applied to the comparator varies with the resistance of the power source wiring and the like, which in turn affects the offset generated in the comparator. If calibration is performed under conditions similar to those of a normal comparison operation, the offset is therefore canceled properly for those operating conditions.
In typical background calibration, the circuit subjected to calibration is switched every preset period. When the comparator under calibration is switched, its internal circuit operates, and the circuit that generates the switching signal also operates, so the power source current changes.
The change in the power source current affects the clock used for processing the analog signal. That is, in typical background calibration, the change in the power source current caused by switching the calibration target disturbs the clock, which is nominally generated at a constant period, so the clock timing occasionally shifts.
A shift in the clock timing is a timing error. When a dynamic analog signal is processed, the voltage the signal traverses during that error time becomes an error voltage, so a change in the clock timing produces an error in the conversion output of the A/D conversion operation as well. This effect is most pronounced at high input frequencies. As a result, spurs occur at specific frequencies determined by the preset period at which the calibration target is switched, and the characteristics are degraded.
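The size of this error voltage can be bounded with a short calculation. For a full-scale sinusoid v(t) = A sin(2πf_in t), the slew rate peaks at 2πf_in·A, so a clock displacement Δt produces at most v_err ≈ 2πf_in·A·Δt, growing linearly with input frequency; the disturbance repeats at the switching rate, which sets the spur frequency. The numbers below are illustrative assumptions only.

```python
import math

# Worst-case error voltage from a sampling-clock timing error on a
# sinusoidal input: v_err <= 2*pi*f_in * A * dt (peak slew rate * dt).
A = 0.5        # amplitude in V (1 Vpp full scale, assumed)
f_in = 100e6   # input frequency in Hz (assumed)
dt = 1e-12     # clock timing error in s (1 ps, assumed)

v_err = 2 * math.pi * f_in * A * dt
print(f"worst-case error: {v_err * 1e6:.1f} uV")  # -> 314.2 uV

# The calibration target is switched every T_sw seconds, so the timing
# disturbance repeats at f_spur = 1/T_sw (and its harmonics), which is
# where the spurs appear in the output spectrum.
T_sw = 1e-6    # switching period, assumed
f_spur = 1 / T_sw
print(f"spur fundamental: {f_spur / 1e6:.1f} MHz")  # -> 1.0 MHz
```

Doubling the input frequency doubles the worst-case error voltage, which is why the degradation is observed mainly in the high input frequency range.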