Mobile terminals such as cellular phones have become ubiquitous in modern society. A mobile terminal operates by sending an electromagnetic signal through the air to a base station and receiving electromagnetic signals through the air from the base station. An unfortunate side effect of the convenience of this wireless communication is that the signal-carrying electromagnetic radiation that forms the backbone of the communication may interfere with other electronic devices. This phenomenon is known as electromagnetic interference (EMI); the related discipline of designing devices to limit and tolerate such interference is known as electromagnetic compatibility (EMC).
While interference with other electronic devices such as a computer or television is problematic, it is also possible for multiple mobile terminals operating in proximity to one another to experience cross-channel EMI. That is, one mobile terminal may be transmitting in a first channel, but some of the signal may spill over as noise into channels that are nearby in the frequency spectrum and on which a second mobile terminal is trying to operate. This spill-over transmission is known by various terms, but is termed herein as “side band transmission.”
To combat EMI in the United States, the Federal Communications Commission (FCC) has promulgated emissions standards that limit how much energy may be radiated within certain frequency bands. On top of the FCC emissions rules, the various communication protocols used by mobile terminals may impose more restrictive limitations with specific attention paid to side band transmission levels. For example, Annex A of the Global System for Mobile Communications (GSM) 05.05 version 8.5.1, released 1999, indicates that the maximum allowed level for spurious side band signals is the larger of −60 dBc or −36 dBm. This measurement is to be averaged over at least two hundred transmit power cycles.
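As an illustration of how the two-part limit above combines (a sketch only, not part of the standard; the function name and example carrier powers are hypothetical), the governing limit is simply the less restrictive of the relative and absolute figures:

```python
def spurious_sideband_limit_dbm(carrier_power_dbm: float) -> float:
    """Allowed spurious side band level per the cited GSM 05.05 Annex A
    figure: the larger (less restrictive) of -60 dBc, i.e. 60 dB below
    the carrier, or an absolute floor of -36 dBm."""
    relative_limit_dbm = carrier_power_dbm - 60.0  # -60 dBc in absolute terms
    absolute_limit_dbm = -36.0
    return max(relative_limit_dbm, absolute_limit_dbm)

# For a hypothetical 33 dBm (2 W) burst, -60 dBc works out to -27 dBm,
# which exceeds -36 dBm, so the relative limit governs:
print(spurious_sideband_limit_dbm(33.0))   # -27.0
# For a weak 10 dBm carrier, the absolute -36 dBm floor governs:
print(spurious_sideband_limit_dbm(10.0))   # -36.0
```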
Against the backdrop of these standards, many mobile terminals incorporate DC-DC converters in their internal circuitry to change a DC voltage level of a battery to a lower or higher DC voltage level depending on the needs of the internal circuitry of the mobile terminal. A common way to implement a DC-DC converter uses a switched mode power supply that includes a switch that opens and closes at a predetermined frequency according to a clock signal. Such switched mode power supplies exhibit a periodic ripple in their output at the switching frequency. If the DC-DC converter is used to provide a supply voltage (Vcc) to a saturated power amplifier, this ripple may mix with the radio frequency carrier to generate spurious side band signals.
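The magnitude of the switching ripple described above can be sketched with the standard first-order textbook estimate for an ideal buck (step-down) converter in continuous conduction mode. The component values below are hypothetical handset-class numbers chosen for illustration, and second-order effects such as capacitor ESR are ignored:

```python
def buck_output_ripple(v_in, v_out, f_sw, L, C):
    """First-order peak-to-peak output voltage ripple estimate for an
    ideal buck converter in continuous conduction mode.
    Illustrative only; capacitor ESR and load effects are ignored."""
    duty = v_out / v_in                        # duty cycle D = Vout / Vin
    di_l = v_out * (1.0 - duty) / (L * f_sw)   # inductor current ripple (A)
    dv_out = di_l / (8.0 * C * f_sw)           # capacitor-limited ripple (V)
    return dv_out

# Hypothetical example: 4.2 V battery stepped down to 3.4 V for a power
# amplifier, switching at 2 MHz with a 1 uH / 10 uF output filter.
ripple = buck_output_ripple(4.2, 3.4, 2e6, 1e-6, 10e-6)
print(f"{ripple * 1e3:.2f} mV peak-to-peak")   # about 2 mV
```

Even a few millivolts of periodic ripple on the supply of a saturated power amplifier can produce side bands at offsets of the switching frequency from the carrier, which is why the emission limits above matter here.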
To combat this ripple, manufacturers use various approaches. For example, some manufacturers use low drop-out linear regulators for power control associated with power amplifiers instead of the switched mode power supplies. This substitution avoids the ripple issues, but does so at the expense of decreased efficiency and shorter battery life.
A drive train of a switched mode power supply typically includes a series inductor and a parallel capacitor. In order to reduce the ripple in the output of the switched mode power supply, some manufacturers use large inductors and capacitors. However, these large components require a significant amount of die area and are more expensive. In addition, the large inductance and capacitance result in a low Unity Gain Bandwidth (UGBW) for the control loop. The low UGBW makes polar modulation impossible and may cause turn-on mask problems.
Another method to reduce the ripple is to operate the power amplifier receiving the output of the switched mode power supply in linear mode. However, running the power amplifier linearly significantly reduces the efficiency of the power amplifier and negates the efficiency benefit of the switched mode power supply. Yet another method used to reduce the ripple is to use a multiphase converter. However, each phase requires a separate inductor, thereby adding cost and size. Thus, there remains a need for an improved system and method for reducing the ripple in the output of a switched mode power supply.
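The ripple cancellation that motivates the multiphase approach can be sketched numerically (an idealized illustration only, with unit-amplitude triangular phase currents at an assumed 50% duty cycle): interleaving the phases by 1/N of the switching period causes their ripples to partially or wholly cancel in the summed output, which is why the technique works despite needing one inductor per phase:

```python
def summed_ripple_pp(n_phases, samples=1000):
    """Peak-to-peak ripple of n interleaved unit-amplitude triangular
    phase currents, each shifted by 1/n of the switching period.
    Idealized 50% duty-cycle illustration of interleaving cancellation."""
    def tri(x):
        # Unit triangle wave: period 1, range [0, 1].
        x %= 1.0
        return 2.0 * x if x < 0.5 else 2.0 * (1.0 - x)

    totals = [sum(tri(i / samples + k / n_phases) for k in range(n_phases))
              for i in range(samples)]
    return max(totals) - min(totals)

print(summed_ripple_pp(1))   # 1.0 -- single phase: full ripple
print(summed_ripple_pp(2))   # effectively 0 -- two phases at 50% duty cancel
```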