1. Field of the Invention
The present invention relates to a semiconductor integrated circuit having an input buffer operation error preventing circuit for preventing the input buffer from erroneously detecting the input signal level due to output noise generated upon a variation in the output data of an output buffer.
2. Description of the Related Art
In a semiconductor integrated circuit, if the drive power of a data output buffer is increased in order to obtain a high-speed access time by allowing the output data signal to rise and fall quickly, then a noise signal is generated on a power supply line (including a ground line) due to a temporarily large current flowing through the output buffer. This output noise induces an input level detection error at, for example, a signal input buffer, causing a problem as will be set forth below.
FIGS. 1 and 2 show an output buffer and an input buffer, respectively, and FIGS. 3A-3E show a typical error detection operation of the input buffer at the time when the output data of the output buffer varies. That is, at a "0" level output of the output buffer, a noise signal is induced on a V.sub.SS line (a ground line) by the drive peak current of an N-channel transistor TN in the output buffer, resulting in a potential variation. If, at this time, an input signal of a TTL (transistor-transistor logic) level applied to the input buffer is at a high level and there is only a small margin in the input signal level, a first stage of the input buffer temporarily assumes the same state as upon receipt of a TTL input signal of a low level, due to the influence of the noise signal on the V.sub.SS potential, causing an output node A of the first stage of the input buffer to go high temporarily. On the other hand, at a "1" level output of the output buffer, a noise signal is induced on the V.sub.DD power supply line by the drive peak current of a P-channel transistor TP in the output buffer. If, at this time, the TTL input signal applied to the input buffer is at a low level and there is only a small margin in the input signal level, the first stage of the input buffer temporarily assumes the same state as upon receipt of a TTL input signal of a high level, due to the influence of the noise signal on the V.sub.DD potential, causing the output node A of the first stage to go low temporarily.
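The misdetection mechanism described above can be illustrated with a minimal sketch, not taken from the patent: the first stage is modeled as an idealized inverter whose trip point is referenced to the local V.sub.SS rail, so that ground bounce shifts the effective threshold above a TTL-high input. The trip-point and noise voltages used here are illustrative assumptions only.

```python
# Hypothetical sketch (not from the patent): models how ground bounce on the
# V_SS line can make a first-stage input buffer misread a TTL-high input.
# The 1.4 V trip point and 0.8 V bounce are illustrative assumptions.

def first_stage_output(v_in, v_ss, v_trip=1.4):
    """Idealized inverting first stage: node A goes high when the input
    falls below a trip point referenced to the local V_SS potential."""
    effective_trip = v_ss + v_trip  # the trip point rides on the local ground
    return "high" if v_in < effective_trip else "low"

TTL_HIGH = 2.0  # TTL high level with only a small margin above the trip point

# Quiet ground: the TTL-high input is read correctly, node A stays low.
print(first_stage_output(TTL_HIGH, v_ss=0.0))  # -> low

# ~0.8 V of ground bounce while transistor TN sinks its peak current:
# the effective trip point rises above the input, and node A goes high
# temporarily, exactly the error described above.
print(first_stage_output(TTL_HIGH, v_ss=0.8))  # -> high
```

The symmetric case, a V.sub.DD dip misreading a TTL-low input as high, follows by referencing the trip point to the local V.sub.DD rail instead.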
In order to prevent the aforementioned problem, that is, an operation error of the input buffer resulting from output noise upon a variation in the output data, the usual practice is to reduce the drive power of the output buffer and hence the amount of output noise generated or, in a memory of a multi-bit structure, to reduce the amount of output noise generated by displacing each bit output in time little by little. These methods present a problem because they sacrifice data read-out speed. Another method is, prior to a variation in the output data, to short the input and output terminals of a final stage of the output buffer so that, with the output waveform made less sharp, the output noise component may be reduced. For this method, reference is made to Wada, T., et al., "A 34ns 1Mb CMOS SRAM using Triple Poly", ISSCC DIGEST OF TECHNICAL PAPERS, pp. 262-263, Feb. 1987. According to this method, however, the input and output terminals of the output buffer are forced into conduction, entailing a risk of inducing a large current therethrough and hence a power supply potential variation. Furthermore, there is also a risk that this conduction operation will sacrifice data read-out speed.
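Why displacing each bit output in time reduces the output noise can be sketched as follows; this is a hypothetical illustration, not the patent's circuit. Each bit's drive current is modeled as a small triangular pulse, and the peak of the summed supply current is compared for simultaneous versus staggered switching. The pulse shape, bit count, and offsets are illustrative assumptions.

```python
# Hypothetical sketch (not from the patent): in a multi-bit memory, delaying
# each bit's output transition "little by little" lowers the peak supply
# current, and hence the output noise, at the cost of read-out speed.

def peak_current(n_bits, pulse, offset_per_bit):
    """Sum one drive-current pulse per bit, each delayed by a per-bit
    offset, and return the worst-case instantaneous total current."""
    length = len(pulse) + offset_per_bit * (n_bits - 1)
    total = [0.0] * length
    for bit in range(n_bits):
        start = bit * offset_per_bit
        for i, sample in enumerate(pulse):
            total[start + i] += sample
    return max(total)

pulse = [0.0, 1.0, 2.0, 1.0, 0.0]  # one bit's drive current peak (arbitrary units)

simultaneous = peak_current(8, pulse, offset_per_bit=0)  # all bits switch at once
staggered = peak_current(8, pulse, offset_per_bit=2)     # bits displaced in time

print(simultaneous)  # -> 16.0 : eight coincident peaks add directly
print(staggered)     # -> 2.0  : displaced peaks no longer coincide
```

The staggered case spreads the same total charge over a longer interval, which is precisely why the patent notes that this method is used at the sacrifice of data read-out speed.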