Power amplifiers are among the most expensive and most power-consuming components in communication systems. Digital pre-distortion is a technique that reduces power amplifier cost while improving efficiency. Pre-distortion refers to distortion intentionally applied to a transmission signal before it is amplified by a power amplifier. This distortion is ideally the inverse of the unwanted distortion introduced by the power amplifier, so that the resulting amplified transmission signal is as nearly linear as possible.
With pre-distortion, the power amplifier's linear operating range is extended so that it can be operated at a higher percentage of its power rating. A lower-power, lower-cost linearized power amplifier can therefore replace a higher-power, higher-cost power amplifier that must be operated at a lower percentage of its power rating to achieve the desired linearity. Furthermore, the linearized power amplifier operates more efficiently: for a given output power level, a lower-power amplifier operating efficiently consumes substantially less power than an inefficient higher-power amplifier. These benefits are even more pronounced in multicarrier applications, where peak-to-average power ratios tend to be large.
In general, the gain and phase transfer characteristics of a typical power amplifier change as a function of the magnitude of the transmission signal being amplified. In particular, gain tends to droop and phase shift tends to increase as the transmission signal magnitude approaches the power amplifier's saturation point. Accordingly, a typical linearizer implements pre-distortion functions that amplify the transmission signal by a magnitude-dependent amount to compensate for gain droop, and apply a magnitude-dependent phase shift of opposing polarity to compensate for the power amplifier-induced phase shift.
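As an illustration of magnitude-dependent gain and phase correction, a memoryless pre-distorter can be sketched as below. The polynomial form and the specific coefficients are hypothetical, chosen only to show how gain expansion and an opposing phase shift are applied as functions of signal magnitude; a real linearizer would derive its correction functions from the actual amplifier characteristics.

```python
import numpy as np

def predistort(x, gain_coeffs, phase_coeffs):
    """Apply magnitude-dependent pre-distortion to complex baseband samples.

    gain_coeffs / phase_coeffs are hypothetical polynomial coefficients in
    |x| that model the inverse of the amplifier's AM/AM (gain droop) and
    AM/PM (phase shift) characteristics.
    """
    mag = np.abs(x)
    gain = np.polyval(gain_coeffs, mag)    # expands gain as the PA's gain droops
    phase = np.polyval(phase_coeffs, mag)  # opposes the PA-induced phase shift
    return x * gain * np.exp(1j * phase)
```

With identity coefficients (`gain_coeffs=[1.0]`, `phase_coeffs=[0.0]`) the signal passes through unchanged; a positive linear term in `gain_coeffs` boosts large-magnitude samples to counteract droop near saturation.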
Adaptive pre-distortion utilizes a feedback signal to determine the characteristics of the pre-distortion functions applied to the transmission signal by the pre-distorter. Gradient techniques have been used to compare pre-distorter input and power amplifier output values on a sample-by-sample basis, in both amplitude and phase, and thereby adapt the pre-distortion functions over time to improve linearity. Unfortunately, the poor linearity inherently exhibited before a pre-distorter has adapted can produce extensive intermodulation products and significant spectral regrowth. This necessitates processing a wideband feedback signal whose bandwidth can be many times that of the transmission signal itself, which in turn requires a very expensive, high-performance analog-to-digital converter in the feedback signal path. Such a converter can end up being the most expensive component in the transmitter and can greatly diminish any power amplifier cost savings gained through pre-distortion.
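The sample-by-sample gradient adaptation described above can be sketched as follows. This is a minimal illustration, not the method of any particular system: the memoryless PA model, the lookup-table structure, and the step size `mu` are all hypothetical, and ideal wideband feedback (normalized-gain PA output available per sample) is assumed.

```python
import numpy as np

def pa_model(v):
    """Hypothetical memoryless PA with gain droop: gain falls with |v|."""
    return v * (1.0 - 0.2 * np.abs(v))

def adapt_lut(samples, lut, mu=0.1):
    """Sample-by-sample gradient adaptation of a magnitude-indexed
    complex-gain lookup table, nudging each entry so the (normalized)
    PA output tracks the pre-distorter input."""
    edges = np.linspace(0.0, 1.0, len(lut) + 1)
    for x in samples:
        if x == 0:
            continue
        idx = min(np.searchsorted(edges, abs(x)) - 1, len(lut) - 1)
        y = pa_model(lut[idx] * x)                 # fed-back amplifier output
        e = x - y                                  # sample-by-sample error
        lut[idx] += mu * e * np.conj(x) / abs(x) ** 2
    return lut
```

Each update moves one table entry a small step against the observed error, so entries covering frequently exercised magnitudes converge toward the inverse of the amplifier's compression at that level.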
Narrowband feedback results from processing only the out-of-band emissions. Using a narrowband feedback signal would permit a less expensive and more practical analog-to-digital converter in the feedback signal path, but conventional attempts at implementing adaptive pre-distortion with narrowband feedback have produced unsatisfactory results. These attempts have relied on gradient adaptation methods similar to those used with wideband feedback; such methods converge only in specialized situations, and converge slowly even then. Consequently, conventional narrowband feedback methods leave an undesirable amount of adjacent channel power.