In modern radio-frequency (RF) transceiver systems, reducing current consumption is one of the main design targets. The dominant current sink in the transceiver chain is still the power amplifier (PA), where considerable hardware design effort is spent to reach an acceptable compromise between current consumption and linearity across many (environmental) conditions, e.g. temperature, frequency, and voltage standing wave ratio (VSWR).
To further reduce the power amplifier's current consumption, adaptive (software) algorithms may be used, which can overcome certain hurdles in hardware design. Examples of such algorithms are digital predistortion, which extends the linear output power range, and bias point adjustment, which adapts the linearity to the specification.
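As a rough illustration of the predistortion idea, the following sketch models the PA as a memoryless third-order compression nonlinearity and applies its first-order inverse before the PA. The model, coefficient value, and signal are purely illustrative assumptions, not the method of any particular transceiver:

```python
import numpy as np

# Assumed behavioral PA model: gain compression via a cubic term,
# y = x - c3*|x|^2 * x. The coefficient c3 is a hypothetical value.
c3 = 0.1

def pa(x):
    """Memoryless third-order PA compression model (illustrative)."""
    return x - c3 * np.abs(x)**2 * x

def predistort(x):
    """First-order inverse: pre-expand the signal to cancel compression."""
    return x + c3 * np.abs(x)**2 * x

# Complex baseband two-tone test signal (normalized frequencies assumed)
t = np.arange(1000)
x = 0.5 * (np.exp(2j * np.pi * 0.010 * t) + np.exp(2j * np.pi * 0.013 * t))

y_raw = pa(x)               # PA alone
y_dpd = pa(predistort(x))   # PA preceded by the predistorter

# Deviation from an ideal (unity-gain) linear PA
err_raw = np.max(np.abs(y_raw - x))
err_dpd = np.max(np.abs(y_dpd - x))
print(err_raw, err_dpd)  # the predistorted path shows a smaller error
```

The residual error of the predistorted path is of order c3 squared, so even this crude first-order inverse already linearizes the cubic model noticeably; practical predistorters additionally model memory effects and adapt their coefficients from feedback measurements.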
The goal of such algorithms is to push the linearity of the PA as close as possible to the specification limit, and thus save current.
Therefore, measuring linearity or signal distortion (as defined in the communication standard) inside the RF transceiver is one of the key requirements for such adaptive algorithms to succeed.
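One widely used distortion metric of this kind is the error vector magnitude (EVM): the RMS deviation of the measured symbols from the ideal constellation, normalized to the reference power. The sketch below computes an EVM figure for a 16-QAM-like signal distorted by the same kind of cubic compression; the function name, seed, and coefficient are illustrative assumptions:

```python
import numpy as np

def evm_rms(ref, meas):
    """RMS EVM in percent, after removing a best-fit complex gain.

    A least-squares complex gain is applied to the measured symbols so
    that a pure gain/phase offset does not count as distortion.
    """
    g = np.vdot(meas, ref) / np.vdot(meas, meas)
    err = ref - g * meas
    return 100 * np.sqrt(np.mean(np.abs(err)**2) / np.mean(np.abs(ref)**2))

# Illustrative 16-QAM reference symbols (unit average power)
rng = np.random.default_rng(0)
levels = np.array([-3, -1, 1, 3])
ref = (rng.choice(levels, 256) + 1j * rng.choice(levels, 256)) / np.sqrt(10)

# "Measured" symbols: the reference passed through a mild cubic
# compression (hypothetical coefficient), as a stand-in for a PA
meas = ref - 0.05 * np.abs(ref)**2 * ref

evm = evm_rms(ref, meas)
print(f"EVM = {evm:.2f} %")
```

Because the outer constellation points are compressed more than the inner ones, a residual error remains even after the gain correction, which is exactly the amplitude-dependent distortion an adaptive bias or predistortion algorithm would try to keep just inside the standard's EVM limit.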