DC linear voltage regulators are circuits utilized to supply a regulated output voltage to a load circuit, and typically include an output stage transistor (e.g., a power FET) and a differential (operational) amplifier (error amplifier). The differential amplifier compares a fraction of the regulated output voltage (which is fed back by way of a voltage divider to the inverting input terminal of the differential amplifier) with a stable (bandgap) reference voltage that is supplied to the amplifier's non-inverting input terminal, and generates a gate voltage that is applied to the gate terminal of the power FET, which is connected between an unregulated voltage supply and the load. During operation, the differential amplifier adjusts (increases or decreases) the gate voltage as needed such that the output voltage is maintained at the desired regulated voltage level. For example, if load conditions change such that the output voltage increases relative to the reference voltage (e.g., because the load circuit enters a hibernation or sleep mode), the differential amplifier reduces the gate voltage applied to the power FET, thereby adjusting (reducing) the output voltage to the desired regulated voltage level. Conversely, if the output voltage decreases relative to the reference voltage (e.g., because the load switches from a sleep mode to a normal operating mode), the differential amplifier increases the gate voltage applied to the power FET, thereby adjusting (increasing) the output voltage to the desired regulated voltage level. By continuously adjusting the gate voltage in this way, the regulator maintains a constant regulated voltage across the load.
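The negative-feedback behavior described above can be sketched as a simple behavioral model. The reference voltage, divider ratio, and loop gain below are illustrative assumptions (not taken from the text), and the output stage is idealized as a source follower whose output tracks the gate voltage:

```python
# Behavioral sketch of the negative-feedback loop in a linear regulator.
# All component values are illustrative assumptions.

V_REF = 1.2           # bandgap reference voltage (V), assumed
DIVIDER_RATIO = 0.25  # fraction of V_out fed back to the error amplifier, assumed
GAIN = 0.5            # gate-voltage adjustment per volt of error, assumed

def regulate(v_out, steps=200):
    """Iteratively adjust the gate voltage until V_out settles at V_REF / DIVIDER_RATIO."""
    v_gate = v_out  # idealized source-follower output stage
    for _ in range(steps):
        v_fb = DIVIDER_RATIO * v_out
        error = V_REF - v_fb   # positive error -> raise gate voltage, and vice versa
        v_gate += GAIN * error
        v_out = v_gate         # output tracks gate in this idealized model
    return v_out
```

Whether the output starts above or below the target, the loop drives it toward V_REF / DIVIDER_RATIO (4.8 V with the assumed values), mirroring how the error amplifier raises or lowers the gate voltage to hold the regulated level.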
A low-dropout (LDO) regulator is a type of DC linear voltage regulator that can operate with a very small input-output differential voltage, which provides advantages over other linear voltage regulators by supporting lower minimum operating voltages, providing higher-efficiency operation, and reducing heat generation. LDO regulators utilize a current source circuit to stabilize and maintain the regulated output voltage under low or zero load current conditions. The current source circuit is typically coupled in parallel with the load between the regulated output voltage and ground, and functions to draw a minimal sink current through the power FET. That is, when the load enters a standby or sleep mode (i.e., is drawing zero or a very small load current), the current source draws a minimum sink current through the FET in order to maintain the desired regulated voltage across the load.
During periods of zero or low load current, the energy consumption and heat generation produced by the current source circuitry of an LDO are considered acceptable because the generated sink current serves the beneficial purpose of keeping the regulated output voltage within a stable operating bandwidth, and because the total amount of heat generated by the LDO is relatively small during these periods. Under normal operating (i.e., high load current) conditions, however, unless the current source circuitry is disabled, the current through the power FET exceeds the load current by the amount of the sink current without providing a functional benefit, which unnecessarily increases power consumption and heat generation. That is, during high load current conditions, the sink current drawn through the current source circuit provides no benefit in exchange for the consumed energy and generated heat because the high load current itself maintains a stable LDO operating bandwidth. Moreover, because the sink current flows from the FET output to ground, the amount of heat generated in the current source is proportional to the regulated LDO output voltage. As such, in circuits requiring high regulated voltages, heat dissipation in the current source is a significant contributor to overall LDO heating, and thus may become a critical factor limiting overall performance of the LDO circuit. Accordingly, although the use of current source circuitry is beneficial during periods of zero or very small load current, the current source circuitry effectively becomes a liability by undesirably consuming energy and generating heat during periods of high load current.
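The proportionality noted above follows from the sink current flowing from the regulated output to ground, so the power it dissipates is simply the sink current times the output voltage. A minimal sketch, with an illustrative (assumed) sink current value:

```python
# Power dissipated in the sink current source: P = I_sink * V_out,
# so heat grows linearly with the regulated output voltage.
# The sink current value is an illustrative assumption.

I_SINK = 0.001  # assumed 1 mA minimum sink current (A)

def sink_power(v_out):
    """Power (W) burned in the current source for a given regulated output voltage."""
    return I_SINK * v_out
```

Doubling the regulated output voltage doubles the heat dissipated in the current source, which is why this dissipation becomes significant in circuits requiring high regulated voltages.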
To reduce power consumption and to avoid possible overheating problems, LDO regulators typically include a mechanism for turning off the sink current source during periods when the load consumes more than the minimum sink current (i.e., when the load is in a normal operating state). Prior art approaches use control circuitry to monitor (sense) the load current (or the LDO output voltage), and to turn off the sink current source when the load current is higher than the minimum sink current (or when the output voltage falls below a minimum voltage level). A problem with these prior art approaches is that the control circuitry must remain active (i.e., continuously draw current) in order to monitor the load conditions. That is, the control circuitry of these prior art solutions continues to draw operating current through the output stage/amplifier at all times in order to continuously monitor the load current, so even when the bias current provider/transistor has been turned off because the load current exceeds the minimum sink current, the control circuitry continues to generate heat and to draw a significant amount of power, which reduces battery life in portable devices. Moreover, the complicated control circuitry of the prior art approaches requires a significant amount of chip area, which increases production costs.
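The prior art decision logic described above can be sketched as a simple threshold comparison. The function name and threshold value are illustrative assumptions; the point of the sketch is that the comparison must be evaluated continuously, so the sensing circuitry stays powered even while the sink source is off:

```python
# Sketch of the prior art approach: an always-on monitor compares the sensed
# load current against the minimum sink current and gates the sink source.
# Names and values are illustrative assumptions.

I_SINK_MIN = 0.001  # assumed minimum sink current (A)

def sink_enabled(i_load):
    """Return True when the sink current source should remain on.

    In the prior art approaches, this comparison runs continuously, so the
    monitoring circuitry draws current and generates heat even while the
    sink source itself is turned off -- the drawback noted above.
    """
    return i_load < I_SINK_MIN

# sink_enabled(0.0) -> True  (sleep mode: sink maintains regulation)
# sink_enabled(0.1) -> False (normal operation: sink off, monitor still active)
```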
What is needed is a linear regulator having a self-adjusting sink current bias source that reliably draws a sink current through the output stage during zero or low load current conditions, and that reliably turns off the sink current to reduce power consumption and heat generation in the output stage during high load current conditions, without requiring a complicated and continuously active control circuit.