The semiconductor industry has experienced rapid growth due to continuous improvements in the integration density of a variety of electronic components (e.g., transistors, diodes, resistors, and capacitors). For the most part, this improvement in integration density has come from repeated reductions in minimum feature size, which allow more components to be integrated into a given area. As demand for ever smaller electronic devices such as central processing units (CPUs) has grown, a corresponding need has arisen to reduce the voltage rating of semiconductor devices fabricated on a shrinking process node.
As new CPU design and manufacturing technologies have been employed, new-generation CPU devices are capable of operating reliably from a supply voltage as low as approximately 0.9V. Such a low supply voltage allows CPU devices to be fabricated in a 1.8V CMOS process. On the other hand, some peripheral devices such as input/output (I/O) interface devices still operate from a higher supply voltage (e.g., 3.3V). When a logic signal is forwarded from an I/O interface device to a CPU, the mismatch between the supply voltages of the two devices may cause a reliability issue. More particularly, a logic-high signal (e.g., 3.3V) may exceed the maximum voltage (e.g., 1.8V) for which the CPU is rated.
Conventional voltage level shifting devices are employed to shift a voltage level up when a logic signal is forwarded from a CPU to an I/O device and shift a voltage level down when a logic signal is sent from an I/O device to a CPU. A variety of voltage level shifting devices have been adopted to convert an input voltage signal to an output voltage signal within a range suitable for a device operating from a different supply voltage.
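The up-shift and down-shift behavior described above can be captured in a minimal behavioral sketch. The function below, its name, and the half-supply switching threshold are illustrative assumptions, not taken from the original text; a real level shifter is an analog circuit, and this model only re-drives a logic level onto the rails of the destination supply domain.

```python
def level_shift(v_in: float, vdd_in: float, vdd_out: float) -> float:
    """Behavioral model of a voltage level shifter (hypothetical).

    A logic-high input (above half the input-domain supply) is
    re-driven at the output-domain supply rail; a logic-low input
    is driven to 0 V.
    """
    logic_high = v_in > vdd_in / 2  # assumed switching point at VDD/2
    return vdd_out if logic_high else 0.0

# Down-shift: a 3.3 V I/O logic-high into the 1.8 V CPU domain.
cpu_side = level_shift(3.3, vdd_in=3.3, vdd_out=1.8)   # 1.8

# Up-shift: a 1.8 V CPU logic-high out to the 3.3 V I/O domain.
io_side = level_shift(1.8, vdd_in=1.8, vdd_out=3.3)    # 3.3
```

Note that in both directions the logic value is preserved while the swing is re-mapped, which is exactly the role the conventional level-shifting devices above perform between mismatched supply domains.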