1. Field of the Invention
The present invention relates generally to integrated circuits and, more particularly, to a method and circuit for reducing delay variations in CMOS circuits.
2. Description of the Prior Art
In many integrated circuits, the performance of a CMOS device varies with the supply voltage, temperature, and process conditions or states. The circuit generally operates faster as the supply voltage is increased and slower as the supply voltage is decreased. When the supply voltage is increased, the temperature is reduced, and the operating process state is at a faster setting, the CMOS device tends to exhibit improved performance, that is, a shorter propagation delay. Conversely, with an increase in temperature, a reduction in the supply voltage, and a shift of the operating process state to a slower setting, the effective threshold voltage of the CMOS device is increased. This negatively impacts the performance of the corresponding integrated circuit and, more particularly, complicates the design of a delay lock loop (DLL) coarse delay step.
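The supply- and threshold-voltage dependence described above can be illustrated with a simple first-order model. The following sketch assumes the alpha-power-law approximation of gate delay; the specific constants (load capacitance, scale factor, and velocity-saturation index alpha) are illustrative assumptions only and do not correspond to any particular process.

```python
# Illustrative first-order (alpha-power-law) model of CMOS gate delay:
#     t_d ~ k * C_load * V_DD / (V_DD - V_t)^alpha
# All numeric constants here are hypothetical, chosen for illustration.

def gate_delay(v_dd, v_t, c_load=10e-15, k=1.0, alpha=1.3):
    """Approximate propagation delay of a CMOS gate (seconds)."""
    return k * c_load * v_dd / (v_dd - v_t) ** alpha

# A higher supply voltage (fast corner) yields a shorter delay; a lower
# supply combined with the higher effective threshold voltage of a slow,
# hot corner yields a longer delay, as described in the text above.
fast_corner = gate_delay(v_dd=1.32, v_t=0.35)  # high supply, low V_t
slow_corner = gate_delay(v_dd=1.08, v_t=0.45)  # low supply, high V_t
assert slow_corner > fast_corner
```

In this model, the delay variation between the two corners is precisely the effect a DLL coarse delay step must tolerate or compensate.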
FIG. 1 is a block diagram illustrating a conventional constant voltage supply for reducing delay variations caused by the supply voltage. In this conventional design, the supply voltage to the CMOS delay element is held constant. However, the CMOS delay still varies with temperature and process variations.
The problem of delay variations in DLL design is well known in the art, and there are a number of commonly used solutions for combating it. One solution is to provide a common-mode amplifier circuit, which uses a pull-up resistor and a tail current to control the variations in temperature, process, and supply. Another scheme is to generate a local supply for each delay step unit. Many of the commonly known methods for overcoming delay variation in DLL design, however, have significant drawbacks such as increased die area and power consumption.