In power amplifier design, there is a trade-off between efficiency and distortion. Amplifiers that operate under “Class A” conditions create little distortion but are inefficient, whereas amplifiers operated under “Class C” conditions are reasonably efficient but introduce significant signal distortion. For example, “Class C” power amplifiers often operate in a non-linear fashion whereby input signals are distorted at a power amplifier's output when operated near the power amplifier's peak output. While both efficiency and distortion are important considerations in amplifier design, efficiency becomes increasingly important at high power levels. Because of this, designers of many modern transmitters elect to accept some non-linearity in their power amplifiers to obtain good efficiency.
To limit this non-linearity and its corresponding distortion, various linearization techniques are used in conventional approaches. Conventional linearization techniques can be broadly categorized as feedback, feed-forward, or pre-distortion. The last of these, pre-distortion, intentionally distorts the input signal before the power amplifier to compensate for the expected non-linearity of the power amplifier. According to this technique, linearization is achieved by distorting an input signal according to a pre-distortion function that is the inverse of the amplifier's behavior. The pre-distortion technique can be applied at radio frequency (RF), intermediate frequency (IF), or at baseband.
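The idea can be illustrated with a minimal sketch. Here the amplifier is modeled, purely hypothetically, as a memoryless soft-saturating non-linearity (a hyperbolic tangent); the pre-distortion function is then simply its inverse, applied to the baseband signal before amplification. The function names and the tanh model are illustrative assumptions, not part of the original text:

```python
import math

def amplifier(x):
    # Hypothetical memoryless amplifier model: soft saturation (compression)
    # near peak output, loosely analogous to a non-linear power amplifier.
    return math.tanh(x)

def predistort(x):
    # Inverse of the assumed non-linearity (valid for |x| < 1): the signal
    # is intentionally "expanded" so the amplifier's compression cancels it.
    return math.atanh(x)

for x in (0.1, 0.5, 0.9):
    direct = amplifier(x)              # compressed, distorted output
    linearized = amplifier(predistort(x))  # approximately equal to x
    print(f"in={x:.2f}  direct={direct:.4f}  pre-distorted={linearized:.4f}")
```

In practice the amplifier's non-linearity is not known in closed form, so the inverse is typically approximated, for example by a look-up table or polynomial fitted from measurements; this sketch only shows the compensation principle.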
Existing pre-distortion techniques are less than optimal, however, and there is a need for power amplifier systems that provide improved pre-distortion functionality.