1. Field of the Invention
The present invention relates to digital systems comprising one or more integrated circuits and, more particularly, to a digital system comprising a processing unit and at least one output buffer that drives a peripheral unit in response to signals received from the processing unit.
2. Description of the Related Art
As is well known, an output buffer for digital signals is an interface circuit that drives a load in response to a digital control signal. A buffer is normally designed and dimensioned on the basis of direct-current specifications, i.e., on the basis of the maximum value of the supply voltage of the integrated circuit of which the buffer forms part and the maximum value of the current to be supplied to a predetermined resistive circuit. As a result of this dimensioning, the switching speed of the signal generated by the buffer is often considerably greater than is actually necessary. The switching therefore gives rise to very substantial current pulses, i.e., current transients that can lead to spurious switching in the integrated circuit and, consequently, to loss or alteration of the information associated with the digital signal. In mixed integrated circuits, which contain both digital and analog parts, the current transients can even jeopardize the performance of the analog circuits. Furthermore, the supply unit of the integrated circuit sustains a heavy load during each switching event, which can create a serious problem when the integrated circuit forms part of portable equipment, i.e., equipment with limited energy resources.
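The relationship between switching speed and the supply-current transient described above can be sketched numerically. The following is a hypothetical illustration, not taken from the patent: all component values (load capacitance, supply voltage, edge times) are assumed for the example, and the model is simply the charging relation I = C·dV/dt for a capacitive load.

```python
def switching_current(c_load_farads, v_swing_volts, t_switch_seconds):
    """Average current drawn while slewing a capacitive load: I = C * dV/dt."""
    return c_load_farads * v_swing_volts / t_switch_seconds

# Assumed example values (not from the patent):
C_LOAD = 50e-12   # 50 pF capacitive load
V_SWING = 3.3     # full-swing output equal to the supply voltage

# A buffer over-dimensioned for worst-case DC drive might slew its output
# in 1 ns, although a 10 ns edge would satisfy the timing requirement.
i_fast = switching_current(C_LOAD, V_SWING, 1e-9)     # over-dimensioned buffer
i_needed = switching_current(C_LOAD, V_SWING, 10e-9)  # edge actually required

print(f"transient at  1 ns edge: {i_fast * 1e3:.1f} mA")
print(f"transient at 10 ns edge: {i_needed * 1e3:.2f} mA")
```

With these assumed values the over-fast edge draws a current pulse ten times larger than the edge the application actually needs, which is the load the supply unit and its connection paths must sustain at every transition.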
If the problems just outlined are to be avoided, or at least attenuated, the connection paths to the supply unit must be given a cross-section large enough not to cause excessive voltage drops or inductive phenomena; this, however, implies a larger occupied area and does not solve the problem of the excessive load on the supply unit. To reduce the switching speed, buffers have been proposed that have several cascaded input stages controlled in succession with predetermined delay times, as well as buffers with driver circuits capable of regulating the rising and falling edges of the signals to be transferred to the output. The first solution calls for a relatively large area of the integrated circuit and the second is just as complex. In both solutions, moreover, the buffer output current or, more precisely, the switching current, i.e., the current supplied or absorbed during the transitions of the digital signal provided by the buffer, varies as a function of the load. Indeed, when the load is greater than the optimal load fixed during the design of the circuit, so that the output switching time is greater than the time the optimal load would have required, the switching current increases in a controlled manner only until the end of the switching time corresponding to the optimal load, and then increases in an altogether uncontrolled manner for the remainder of the switching time. Expressed in terms of voltages, this effect manifests itself as a variation, from a low to a high value, of the slope of the switching edges of the digital output signals.
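The load-dependence just described can be sketched with a simple numerical model. This is a hypothetical illustration under assumed values (all names and numbers below are assumptions, not from the patent): the driver limits its output current only for the switching time designed for the optimal load, so with a heavier load the output has not finished slewing when that time expires, after which the full, uncontrolled drive current takes over and the edge slope jumps from a low to a high value.

```python
# Assumed example values (not from the patent):
V_SWING = 3.3      # output voltage swing in volts
I_CTRL = 2e-3      # controlled (slew-limited) drive current in amperes
I_MAX = 20e-3      # uncontrolled full drive current in amperes
C_OPT = 10e-12     # optimal load capacitance the driver was designed for
C_HEAVY = 40e-12   # actual, heavier load capacitance

# Switching time for which the current control was designed (optimal load):
T_OPT = C_OPT * V_SWING / I_CTRL

def edge_slope(c_load, t):
    """dV/dt of the rising edge at time t for a given capacitive load.

    Current is regulated only until T_OPT; afterwards the driver supplies
    its full, uncontrolled current.
    """
    i = I_CTRL if t < T_OPT else I_MAX
    return i / c_load

# With the heavy load, the edge starts slow and turns steep at T_OPT:
slope_before = edge_slope(C_HEAVY, 0.5 * T_OPT)
slope_after = edge_slope(C_HEAVY, 1.5 * T_OPT)
print(f"slope before T_OPT: {slope_before / 1e6:.0f} V/us")
print(f"slope after  T_OPT: {slope_after / 1e6:.0f} V/us")
```

In this sketch the slope after T_OPT is ten times the controlled slope, reproducing the low-to-high slope variation of the switching edges that the text attributes to loads greater than the optimal load.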