This invention relates to noise reduction in integrated circuits and, more particularly, to a method and apparatus for reducing the transient noise generated during operation of the output drivers of an integrated circuit.
Digital data processing systems are typically built from integrated circuits that incorporate on a single chip thousands of binary circuit elements such as registers, logic gates and input and output circuits. Generally, an integrated circuit chip processes a number of parallel channels of data, e.g. eight, sixteen or thirty-two. The integrated circuits are mounted on printed circuit boards having conductive paths that interconnect the input and output circuits on the integrated circuit chips and supply thereto operating power supply voltage. An integrated circuit chip is conventionally incorporated into a package that has a number of leads connecting the integrated circuits on the chip to the printed circuit board. Since the output driver channels of an integrated circuit chip are connected through the package leads "off chip" to the conductive paths on the printed circuit board, they must be designed to drive a large capacitive load.
An integrated circuit chip has parasitic inductance, called package inductance, representing the inductance of the conductive paths from the integrated circuit chip to the power supply potentials on the printed circuit board on which the integrated circuit chip is mounted. Usually, the internal conductive path in the package between the integrated circuit chip and the package lead accounts for most of the package inductance.
It is common practice in the design of integrated circuit chips to employ a single package lead to connect all or many of the parallel channels to the power supply potentials off chip. In such a case, the output drivers of all the channels driven toward a power supply potential during any single switching interval contribute to the transient noise due to package inductance.
During the past several years, there has been increasing awareness of the problems resulting from the transient noise generated within large scale integrated (LSI) circuits, and in particular the noise associated with CMOS logic. The major problems occur from the simultaneous switching of multiple output drivers designed to drive high capacitance loads, such as the data bus of a microprocessor or the address bus of a large memory array. Under most conditions, the transient current at the power supply nodes resulting from switching multiple output drivers tends to be additive. Therefore, even a small value of package inductance produces serious noise problems.
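The additive effect described above follows directly from the inductor relation V = L·dI/dt: because the switching drivers share the same power supply lead, their current slews sum through the one package inductance. A minimal numeric sketch, using hypothetical values for the inductance and per-driver current slew (neither figure appears in the original text), illustrates the scaling:

```python
# Illustrative sketch only; all numeric values are assumed examples,
# not figures from the patent text.

L_PACKAGE = 5e-9        # assumed package inductance: 5 nH
DI_DT_PER_DRIVER = 2e7  # assumed current slew per driver: 20 mA/ns = 2e7 A/s

def switching_noise(n_drivers: int,
                    l_package: float = L_PACKAGE,
                    di_dt: float = DI_DT_PER_DRIVER) -> float:
    """Transient noise voltage V = L * (N * dI/dt) when n_drivers
    switch simultaneously through a shared power supply lead."""
    return l_package * n_drivers * di_dt

# One driver produces modest noise; eight simultaneous drivers
# produce eight times as much across the same small inductance.
print(switching_noise(1))  # 0.1 V
print(switching_noise(8))  # 0.8 V
```

Even with only 5 nH of package inductance, the sketch shows how an eight-channel bus switching in one interval multiplies the disturbance on the supply node.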
The problem has become more acute and apparent because evolving technology tends to produce LSI circuits with higher performance. Whether by decreasing the time to perform logical operations, decreasing the delay of the output driver, increasing the effective data rate on the outputs, or increasing the specified load capacitance, more charge must be transferred in a shorter interval of time. Often, during the design of a conventional MOS output driver, the design engineer assumes the worst case model parameters of the fabrication process and the worst case operating conditions. To realize the worst case maximum specified delay, the designer must utilize a large W/L on the output MOS devices to drive the load capacitance at high temperature, minimum power supply voltage and slow process parameters. However, the worst case noise is produced at low temperature, maximum power supply voltage, and fast process parameters. Under worst case noise conditions, the delay decreases and the noise increases, often by an order of magnitude. Several techniques attempt to control or limit the worst case rate of current change (dI/dt) during the transient on the output. However, most such techniques incur the penalty of an increased worst case delay.
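The design tension described above can be made concrete with a hypothetical first-order model (not the patent's method): sizing up the output device width to meet a slow-corner delay target also raises the peak dI/dt, and hence the L·dI/dt noise, at the fast-process, high-voltage, low-temperature corner. All parameter values below are assumed for illustration:

```python
# Hypothetical first-order sizing model; every constant here is an
# assumed example value, not a figure from the patent text.

C_LOAD = 50e-12    # assumed specified load capacitance: 50 pF
L_PACKAGE = 5e-9   # assumed package inductance: 5 nH

def delay(width_factor: float) -> float:
    """First-order RC delay: a wider output device has lower
    on-resistance and drives the load capacitance faster."""
    r_on = 100.0 / width_factor  # assumed 100-ohm on-resistance at W factor 1
    return r_on * C_LOAD

def noise(width_factor: float, process_speedup: float = 1.0) -> float:
    """L*dI/dt noise: a wider device slews the output current
    harder, and a fast process corner harder still."""
    di_dt = width_factor * process_speedup * 2e7  # assumed base slew in A/s
    return L_PACKAGE * di_dt

# Sizing the device up 4x to meet the slow-corner delay target
# shrinks the delay...
print(delay(1.0), delay(4.0))
# ...but at the fast corner (assumed 2.5x slew speedup) the same
# sizing multiplies the noise tenfold relative to the nominal case.
print(noise(1.0), noise(4.0, process_speedup=2.5))
```

The factor-of-ten increase in this toy model mirrors the text's observation that worst case noise can exceed the nominal case by an order of magnitude, which is precisely why dI/dt-limiting techniques that simply weaken the driver pay for lower noise with a longer worst case delay.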