Delay circuits conventionally use resistor-capacitor ("RC") circuits to which a voltage is applied at a start of the delay period. The time required for the capacitor to charge or discharge through the resistor to a predetermined voltage establishes the delay of the delay circuit.
In the past, the voltage applied to the capacitor through the resistor has generally been the power supply voltage (V.sub.CC). In many circumstances the supply voltage may fluctuate, thereby changing the magnitude of the delay produced by the delay circuit. FIG. 1 is a graph showing the voltage on a capacitor as a function of time as the capacitor is charged through a resistor toward two different charging voltages. As seen in FIG. 1, when the capacitor charges toward a voltage V.sub.CC1, the voltage on the capacitor passes through a threshold voltage V.sub.T at a time t.sub.2. When the capacitor is charged toward a higher voltage V.sub.CC2, the voltage on the capacitor passes through the threshold voltage V.sub.T at an earlier time t.sub.1. In conventional delay circuits, a delayed output signal is generated when the voltage on the capacitor exceeds the threshold voltage V.sub.T. Thus, where V.sub.CC is the supply voltage for the delay circuit, a change in the magnitude of the supply voltage changes the time at which the delayed signal is generated, and hence the magnitude of the delay.
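The dependence of the delay on the charging voltage follows directly from the RC charging law, V(t) = V.sub.CC (1 - e.sup.-t/RC); solving for the crossing time gives t = RC ln(V.sub.CC /(V.sub.CC -V.sub.T)). A minimal sketch in Python of this relationship (the component values R, C, and V.sub.T below are illustrative assumptions, not values taken from the disclosure):

```python
import math

def delay_to_threshold(v_cc, v_t, r, c):
    """Time for an RC node charging from 0 V toward v_cc to cross v_t.

    Derived by inverting V(t) = v_cc * (1 - exp(-t / (r * c))).
    """
    if v_t >= v_cc:
        raise ValueError("threshold is never reached")
    return r * c * math.log(v_cc / (v_cc - v_t))

# Illustrative values: R = 10 kOhm, C = 1 nF, threshold V_T = 1.5 V.
R, C, V_T = 10e3, 1e-9, 1.5
t2 = delay_to_threshold(3.3, V_T, R, C)  # lower supply (V_CC1): later crossing, t2
t1 = delay_to_threshold(5.0, V_T, R, C)  # higher supply (V_CC2): earlier crossing, t1
assert t1 < t2  # a higher charging voltage shortens the delay, as in FIG. 1
```

With these assumed values the delay is roughly 70% longer at a 3.3 volt supply than at a 5 volt supply, illustrating how strongly the delay period tracks the charging voltage.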
It is not unusual for V.sub.CC to vary during the normal operation of a circuit. This typically occurs as different loads are applied and removed, thereby varying the impedance seen by the power supply. If a delay circuit uses V.sub.CC to charge and discharge a capacitor to create a delay, the variation in V.sub.CC due to the changing load will create a varying delay period. Integrated circuits may also be operable over a range of supply voltages. For example, a computer made by one manufacturer may use a power supply voltage of 3.3 volts, while a second manufacturer may use a 5 volt supply. Due to the difference in power supply voltages, an integrated circuit containing a conventional delay circuit would reach a given threshold voltage at a different point in time, and thus produce a different delay period, in the first manufacturer's computer than if an integrated circuit containing the same delay circuit were used in the computer made by the second manufacturer. Again, this is illustrated in FIG. 1, discussed above. However, it is generally desirable for the performance of the integrated circuit to be unaffected by the specific supply voltage chosen by the manufacturer to power the integrated circuit.
Some delay circuits attempt to overcome these problems by applying a regulated voltage other than V.sub.CC to the RC delay circuit to produce a fixed delay period. The drawback of this approach is that the delayed output signal no longer swings between a first voltage (usually ground) and V.sub.CC; instead, it varies between the first voltage and the regulated voltage. This presents a problem because the circuit that receives the delayed signal usually changes state or produces a result only when the delayed signal approaches the supply voltage. When the regulated voltage differs from V.sub.CC, the delayed signal may never reach a voltage that causes the downstream circuit to change state.
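This failure mode can be made concrete. Suppose, as an illustrative assumption, that the RC node is charged toward a regulated 2.5 volt rail while the downstream circuit, powered from a 5 volt V.sub.CC, changes state only when its input nears the supply (here taken as 0.7 V.sub.CC). A short sketch, with all numeric values assumed for illustration:

```python
import math

def rc_voltage(t, v_charge, r, c):
    """Voltage on a capacitor charged from 0 V toward v_charge through r."""
    return v_charge * (1.0 - math.exp(-t / (r * c)))

V_REG = 2.5                # regulated charging voltage (assumed)
V_CC = 5.0                 # supply of the downstream circuit (assumed)
switch_point = 0.7 * V_CC  # assumed input level at which the downstream
                           # circuit changes state (3.5 V here)

# Even after 100 time constants the node has only settled to V_REG:
R, C = 10e3, 1e-9
v_final = rc_voltage(100 * R * C, V_REG, R, C)
assert v_final < switch_point  # the delayed signal never trips the downstream stage
```

Because the capacitor voltage asymptotically approaches the regulated voltage and nothing more, no amount of additional delay lets the signal reach the downstream switching level.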
The variation in delay time is a particular problem in semiconductor devices. The accuracy of timing functions in semiconductor devices is often critical, particularly at high operating speeds. Yet semiconductor devices must be usable over a range of power supply voltages, and the stability of the power supply voltage may, at times, be marginal.
There is therefore a need for a delay circuit that provides a precisely controlled delay in semiconductor circuits despite variations in the magnitude of the supply voltage.