Over recent years, battery-powered devices (such as smartphones, tablets and notebooks) have increased their computing power, screen resolution and display frame rate, and have added connected-standby modes. The increase in computing power was enabled by silicon technology in the sub-micron range approaching 10 nm and below. These ultra-narrow gate structures exhibit increased leakage current per transistor. Since CPUs (central processing units) and GPUs (graphical processing units) are composed of many hundreds of millions of transistors, the leakage current of a modern microprocessor is typically significant. To reduce battery drain, the embedded computing cores are typically disconnected from the power supply as often as possible. As a result, the required computing power is provided within short bursts of operation at maximum speed. Hence the power profile of a modern mobile computing device is dominated by relatively long periods of standby currents in the mA range, interrupted by pulses of high peak currents (in the range of 10 A and above). The challenge for a power management unit is to provide low currents at high conversion efficiency (to optimize battery lifetime) while also providing high currents without saturation effects and at stable output voltages.
One solution to avoid saturation (and high I²R losses, i.e. resistive losses) in the current trace from the battery is the use of a battery pack with cells connected in series. For LiIon/LiPoly cells this results in nominal battery pack voltages of approximately N×3.7 V (with N being the number of cells), e.g. 7.4 V for a pack with N=2 cells in series (2S) or 11.1 V for a pack with N=3 cells connected in series (3S). The dominant current consumption is typically caused by the processor, which comprises transistors that can only sustain voltages at or even below the 1 V range. This requires substantial voltage conversion ratios that cannot be provided efficiently by DCDC converters built with relatively large switches, which have a relatively high voltage rating and consume a relatively high gate charge for each switching operation. As a result, standard buck converters tend to switch at relatively low frequency and therefore require coils with high inductance to achieve a reasonable current ripple. These high-inductance coils are typically large, especially when high peak currents need to be provided.
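The series-cell argument above can be sketched numerically: for a fixed load power, a higher pack voltage draws proportionally less current from the battery, so the I²R trace loss drops quadratically with the cell count. The trace resistance and load power below are illustrative assumptions, not values from the text.

```python
V_CELL_NOM = 3.7   # nominal LiIon/LiPoly cell voltage [V]
R_TRACE = 0.02     # assumed battery trace resistance [ohm], illustrative

def trace_loss(n_cells: int, p_load: float) -> float:
    """I^2*R loss in the battery trace for an NS pack at load power p_load [W]."""
    i_batt = p_load / (n_cells * V_CELL_NOM)  # battery current for this pack
    return i_batt ** 2 * R_TRACE

# At a 30 W load, a 2S pack (7.4 V) cuts the trace loss to one quarter
# of the 1S case, since the current is halved and the loss is quadratic:
loss_1s = trace_loss(1, 30.0)
loss_2s = trace_loss(2, 30.0)
assert abs(loss_1s / loss_2s - 4.0) < 1e-9
```

The quadratic scaling is why series stacking (2S, 3S) is attractive despite the larger conversion ratio it imposes on the downstream converters.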
One way to address the above challenge is to create an intermediate rail (at an intermediate voltage) between the output voltage of the battery pack and the processor input voltage. This enables the use of DCDC converters for regulation with switches that have a reduced voltage rating and switch at an increased frequency (at unchanged switching losses). As a consequence of the increased switching frequency, the inductance may be reduced for unchanged current ripple. The reduced inductance lowers the DC resistance and the size of the coil, thereby allowing higher peak currents and a reduced footprint.
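The frequency/inductance trade-off described above follows from the standard ideal-buck ripple relation ΔI = Vout·(1 − D)/(L·f_sw) with duty cycle D = Vout/Vin; solving for L shows that, at fixed ripple, the required inductance scales as 1/f_sw. The voltages and ripple target below are illustrative assumptions.

```python
def inductance_for_ripple(v_in: float, v_out: float,
                          f_sw: float, di_pp: float) -> float:
    """Inductance [H] an ideal buck needs for peak-to-peak ripple di_pp [A]."""
    duty = v_out / v_in
    return v_out * (1.0 - duty) / (f_sw * di_pp)

# 5 V -> 1 V conversion with 1 A peak-to-peak ripple: doubling the switching
# frequency halves the required inductance (and hence the coil size/DCR).
l_1mhz = inductance_for_ripple(5.0, 1.0, 1e6, 1.0)   # 0.8 uH
l_2mhz = inductance_for_ripple(5.0, 1.0, 2e6, 1.0)   # 0.4 uH
assert abs(l_1mhz / l_2mhz - 2.0) < 1e-9
```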
Hence, mobile computing devices may make use of e.g. a 5 V or a 3.3 V intermediate bus. However, using standard (inductive) buck converters with power capabilities larger than the processor peak load suffers from the above-mentioned limitations, i.e. the converter is either relatively large or provides relatively poor light-load efficiency. The cascaded overall efficiency is typically only acceptable if the intermediate bus is generated without any substantial conversion losses (especially during the time-dominant light-load operation).
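The cascaded-efficiency point above can be made concrete: with two stages in series (bus generation, then point-of-load regulation), the battery-to-core efficiency is the product of the stage efficiencies, so any loss in the bus stage multiplies into the total. The stage efficiencies below are illustrative assumptions.

```python
def cascaded_efficiency(eta_bus: float, eta_pmic: float) -> float:
    """Overall efficiency of two converter stages in series."""
    return eta_bus * eta_pmic

# A near-lossless bus stage barely degrades a 90%-efficient PMIC stage,
# while a lossy 85% bus converter drags the overall efficiency well down:
eta_good = cascaded_efficiency(0.99, 0.90)  # ~0.891
eta_poor = cascaded_efficiency(0.85, 0.90)  # ~0.765
assert eta_good > eta_poor
```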
If the intermediate bus is allowed to follow the battery pack voltage, an unregulated capacitive voltage divider may provide relatively high efficiency over a wide current range and without the need for bulky inductors. Using e.g. a 2:1 converter, the output of a 2S battery pack can be converted to the typical voltage range of a 1S pack, enabling the use of standard low-voltage PMICs (power management integrated circuits). However, the missing regulation can lead to problems in cases where the battery pack is deeply discharged (e.g. towards 5 V in case of a 2S battery pack). As the converter provides a fixed 2:1 conversion ratio, the reduced battery voltage directly impacts the voltage at the intermediate bus. The voltage at the intermediate bus may drop further (e.g. by 100-300 mV) when a load current is drawn. As a result, the voltage at the intermediate bus may be below the minimum input voltage required by a cascaded 1S PMIC.
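The undervoltage scenario described above can be sketched as follows; the assumed minimum PMIC input voltage is an illustrative value (not from the text), while the 5 V deep-discharge level and the 100-300 mV droop range come from the description.

```python
DIVIDER_RATIO = 2.0  # fixed 2:1 capacitive divider
V_PMIC_MIN = 2.3     # assumed minimum input of a cascaded 1S PMIC [V], illustrative

def bus_voltage(v_pack: float, droop: float = 0.0) -> float:
    """Intermediate bus voltage of an ideal 2:1 divider, minus load droop [V]."""
    return v_pack / DIVIDER_RATIO - droop

# Deeply discharged 2S pack (~5 V) with 300 mV droop under load:
v_bus = bus_voltage(5.0, droop=0.3)   # 2.2 V
assert v_bus < V_PMIC_MIN             # below the assumed PMIC minimum input
```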
Regulation of the intermediate voltage at the intermediate bus may be provided by using e.g. a 3-Level Buck Converter. As long as the conversion ratio of such a converter is in the range of the embedded capacitive divider (providing e.g. Vin/2), the efficiency of a Multi-Level Converter is only slightly below that of an unregulated capacitive voltage divider using similar switches. However, the maximum output current of a Multi-Level Converter is limited by the current rating of its inductor. To increase its peak current capability, such a converter needs to use larger inductors or route the total current through multiple inductors. The drawback of this approach is increased BOM (bill of materials) cost and increased PCB (printed circuit board) area.
Another limitation of Multi-Level Converters is light-load efficiency in case the conversion ratio is not in the range of the conversion ratio of the inherent capacitive divider. Using e.g. a converter whose embedded capacitive divider has a 2:1 ratio for a 3:1 conversion (e.g. for a 3S battery pack) may result in acceptable efficiency at high current (where resistive loss is dominant), but typically shows poor efficiency at light load, where converter switching losses and inductor core losses determine the achievable efficiency.
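The load-dependent efficiency behaviour above can be illustrated with a toy loss model: conduction loss grows as I²R and dominates at high current, while switching and core losses are roughly load-independent and dominate at light load. The resistance and fixed-loss values are illustrative assumptions.

```python
def efficiency(i_out: float, v_out: float = 1.0,
               r_loss: float = 0.01, p_fixed: float = 0.1) -> float:
    """Toy converter efficiency: output power over output plus losses.

    r_loss models conduction (I^2*R) loss, p_fixed models switching and
    inductor core losses; both values are illustrative assumptions.
    """
    p_out = v_out * i_out
    p_loss = i_out ** 2 * r_loss + p_fixed
    return p_out / (p_out + p_loss)

# At 10 A the I^2*R term dominates and efficiency stays high (~90%);
# at 0.1 A the fixed losses dominate and efficiency collapses (~50%).
assert efficiency(10.0) > 0.9
assert efficiency(0.1) < 0.55
```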