The present invention relates to an analog signal delay circuit requiring a reduced quiescent power consumption.
In electronic circuits, the need often arises for delaying the generation of a signal. For example, a signal that is used to enable a microprocessor must be delayed until after the microprocessor has reached a desired level of electrical stability.
One of the most widely used analog techniques to effect a signal delay is to force the signal into a wait state until an initially discharged capacitor is charged to a minimum required voltage. The charging time of the capacitor provides the needed delay. According to this technique, when a level transition on a target signal is detected, a current source charges an initially discharged capacitor and, at the same time, a comparator compares the voltage across the capacitor to a reference voltage. When the voltage across the capacitor exceeds the reference voltage, the comparator switches state and causes a delayed replica of the target signal to be generated. Thus the delay period is the length of time required to raise the capacitor voltage to the reference voltage. Therefore, the capacitance of the capacitor, the magnitude of the current used to charge the capacitor, and the value of the reference voltage all directly affect the delay period.
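The relationship among the three quantities above can be sketched numerically. The following is an illustrative calculation only (the component values are assumptions, not taken from the invention): a constant current source charging a capacitor from zero produces a linearly rising voltage, v(t) = (I/C)·t, so the delay ends when v(t) reaches the reference voltage.

```python
def delay_seconds(capacitance_f: float, charge_current_a: float, v_ref_v: float) -> float:
    """Time for a constant current to charge an initially discharged
    capacitor from 0 V up to the reference voltage:
    t = C * V_ref / I  (from v(t) = (I / C) * t set equal to V_ref)."""
    return capacitance_f * v_ref_v / charge_current_a

# Hypothetical example values: a 100 pF capacitor charged by a
# 1 uA current source toward a 1 V reference gives a 100 us delay.
print(delay_seconds(100e-12, 1e-6, 1.0))  # → 0.0001 (i.e., 100 us)
```

As the formula shows, halving the charging current or doubling either the capacitance or the reference voltage doubles the delay period, which is why all three parameters directly set the delay.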
The prior art techniques for generating a signal delay, however, suffer from two major disadvantages. First, the magnitude of the current supplied by the current source must be made relatively large to compensate for any leakage current that is present within the circuit. Moreover, because the capacitor is typically chosen to have a relatively large capacitance (in order to minimize the effects of the various parasitic capacitances), a relatively large current is required to charge the capacitor. Second, the current source remains active at all times, thereby resulting in a relatively large quiescent power dissipation.