Generally, in order to reduce the power consumption of a semiconductor memory device while maintaining its reliability, the power supply voltage used in the memory device has been continuously decreased. Accordingly, while the power consumption is gradually reduced, the range of current and voltage which must be sensed by the circuits and elements in the semiconductor memory device is also reduced more and more. That is, the sensing margin of current and voltage has been gradually decreased. Therefore, more precise circuits and elements are needed for finer sensing operations, and a sensing circuit capable of sufficiently sensing and amplifying a signal is particularly required in an integrated circuit.
In conventional semiconductor memory devices, a bit line sense amplifier (BLSA) is used as a sensing circuit to sense and amplify data when the data are read from or stored in a memory cell.
With the increase in integration density of semiconductor memory devices, higher performance of the bit line sense amplifier (BLSA) is required. However, as the load applied to a pull-up device and a pull-down device in the bit line sense amplifier (BLSA) increases, more time is needed to amplify the sensed data up to the power supply voltage. In some cases, it is impossible to amplify the sensed data up to a desired voltage level. Therefore, in order to overcome this disadvantage, an over-driving circuit is employed in such a manner that an external voltage (VEXT=power supply voltage (VDD)) is used together with a core voltage VCORE to drive a pull-up line (RTO: Restore) of the bit line sense amplifier. That is, in order to improve the data amplification speed of the sense amplifier, the pull-up line RTO is first raised to the external voltage (VEXT=power supply voltage (VDD)), which is higher than the core voltage VCORE, and thereafter the core voltage VCORE is applied to the pull-up line RTO.
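The two-phase drive sequence described above can be sketched with a simple first-order RC model of the pull-up line RTO. This is only an illustrative behavioral model: the function names, the time constant, and all voltage values are assumptions for illustration and are not taken from the circuit itself.

```python
import math

def bitline_voltage(t, v_drive, tau, v0=0.0):
    # First-order RC approach of the pull-up line toward its drive voltage.
    return v_drive + (v0 - v_drive) * math.exp(-t / tau)

def settle_time(v_goal, vcore, tau, vext=None, t_od=0.0):
    # Time for the pull-up line RTO to reach v_goal, optionally with an
    # initial over-drive phase at the external voltage vext lasting t_od,
    # followed by normal driving at the core voltage vcore.
    v = 0.0
    t = 0.0
    if vext is not None and t_od > 0.0:
        v = bitline_voltage(t_od, vext, tau, v0=v)  # over-drive phase
        t = t_od
        if v >= v_goal:
            return t
    # Remaining charging toward VCORE: solve
    # v_goal = vcore + (v - vcore) * exp(-dt / tau) for dt.
    dt = tau * math.log((vcore - v) / (vcore - v_goal))
    return t + dt
```

In this sketch, briefly driving RTO at the higher VEXT before handing over to VCORE reaches a given fraction of VCORE sooner than driving at VCORE alone, which is the purpose of the over-driving scheme.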
FIG. 1 is a block diagram of a conventional bit line sensing circuit including an over-driving circuit.
Referring to FIG. 1, a conventional over-driving circuit is composed of a core voltage supplier 10 to drive a pull-up line RTO of a bit line sense amplifier 40 to a core voltage VCORE, a power voltage supplier 20 to drive the pull-up line RTO of the bit line sense amplifier 40 to an external voltage (VEXT=power supply voltage (VDD)), and a discharge unit 30 to discharge the driven voltage on the pull-up line RTO of the bit line sense amplifier 40.
However, in the conventional over-driving circuit, the over-driving operation is carried out at the same over-driving timing and with the same voltage even when the external voltage (VEXT=power supply voltage (VDD)) used for the over-driving of the bit line sense amplifier shifts to a high voltage level (high_VDD) or a low voltage level (low_VDD) relative to the potential level predetermined for the external voltage. The power supply voltage serves as the power source for DRAM operation, and its potential level is 3.3V for SDR SDRAMs, 2.5V for DDR SDRAMs and low-power mobile memories, 1.8V for DDR2 SDRAMs, and 2.5V for Rambus SDRAMs, respectively.
If the high potential level (high_VDD) is applied to the pull-up line RTO of the bit line sense amplifier 40 due to a fluctuation in the external voltage (VEXT=power supply voltage (VDD)), the increased capacitance stress on the memory cell causes core voltage noise (VCORE noise) as a side effect. In addition, unnecessary current dissipation occurs due to the excessively high potential level.
Likewise, when the low potential level (low_VDD) is applied to the pull-up line RTO of the bit line sense amplifier 40, the potential level of the over-driving is not sufficient, so a delay occurs in amplifying the data stored in the memory cell up to the desired voltage level. The efficiency of the over-driving operation is thereby degraded.
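Both failure modes follow from applying a fixed over-drive window regardless of the actual VDD level, and can be sketched with the same kind of first-order model. The window length, time constant, and voltage values below are hypothetical, chosen only to illustrate the mismatch:

```python
import math

VCORE = 1.8  # assumed core voltage target (illustrative)
TAU = 1.0    # assumed RC time constant of the pull-up line (illustrative)
T_OD = 1.5   # fixed over-drive window that does not adapt to VDD (illustrative)

def overdrive_peak(vdd):
    # Voltage reached on RTO at the end of the fixed over-drive window
    # when RTO is driven at the external voltage vdd from 0V.
    return vdd * (1.0 - math.exp(-T_OD / TAU))
```

With a high_VDD (e.g. an assumed 3.0V), the fixed window carries RTO above VCORE, modeling the overshoot that stresses the cell and produces VCORE noise; with a low_VDD (e.g. an assumed 1.9V), RTO ends the window below VCORE, modeling the insufficient over-drive level and the resulting amplification delay.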