Digital communication receivers that employ a direct conversion radio all suffer from a common problem. Because of the high gain at baseband, these radios often create a large, undesirable direct current (DC) offset in the output signal. A DC offset is a DC bias voltage added to the input of a circuit or amplifier. Without correction for this offset, the radio is rendered non-functional.
A few schemes exist for correcting these DC offsets, most of which involve injecting a DC current of the opposite sign into the radio to nullify the offset. Previously, the source of this correction current was a second device external to the radio, typically a baseband processor chip.
This mechanism requires a very high-speed interface between the two devices dedicated to communicating the DC offset correction values. The interface can be either analog or digital, but both options have drawbacks.
The benefit of an analog interface is that it minimizes the number of pins that must be dedicated to this interface on each chip. The drawback of the analog interface is that it requires adding a digital-to-analog converter (DAC) to the baseband chip, which is primarily a digital chip. This adds to the cost of the baseband processor.
A digital interface has the advantage of not requiring a DAC on the baseband processor. However, a digital interface may require many pins to represent the DC correction values with enough resolution, adding cost to both the radio and the baseband processor.
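The pin-count trade-off of a parallel digital interface follows directly from the required correction range and step size. The figures below (a ±100 mV range covered in 1 mV steps) are illustrative assumptions, not values taken from any particular radio:

```python
import math

def bits_required(range_mv, step_mv):
    """Bits needed to represent a correction span at a given resolution.
    range_mv: full correction span in millivolts (assumed example value).
    step_mv: smallest correction step in millivolts (assumed example value)."""
    levels = math.ceil(range_mv / step_mv)      # number of distinct correction codes
    return max(1, math.ceil(math.log2(levels))) # bits (= parallel data pins) to encode them

# Covering a +/-100 mV offset (200 mV span) in 1 mV steps:
print(bits_required(200, 1))  # → 8, i.e. 8 data pins for a parallel bus
```

Each additional bit of resolution adds a pin to a parallel bus, which is why a serial digital link or an analog line is often preferred when pins are scarce.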
Most current interfaces used for communicating the DC offset correction values are analog, primarily because they are less expensive than digital interfaces. The baseband processor periodically enters a calibration routine in which it measures the DC offset of each state of the radio and stores a correction factor on the baseband processor chip. During normal receive mode, the baseband processor then feeds the correction value corresponding to the radio's current state to the radio chip through this analog interface.
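The calibrate-then-correct cycle described above can be sketched as follows. This is a minimal illustration of the idea, not a chip API: the state names, the `measure_dc_offset` callback, and the voltage values are all assumptions made for the example.

```python
def calibrate(radio_states, measure_dc_offset):
    """Calibration routine: measure the DC offset of each radio state
    and store the opposite-sign correction in a lookup table."""
    corrections = {}
    for state in radio_states:
        offset = measure_dc_offset(state)  # measured DC offset (volts) in this state
        corrections[state] = -offset       # nullify by injecting the opposite sign
    return corrections

def corrected_output(raw_sample, state, corrections):
    """Normal receive mode: apply the stored correction for the
    radio's current state to the raw baseband sample."""
    return raw_sample + corrections[state]

# Example: a hypothetical radio whose 'high_gain' state adds a +0.12 V offset.
table = calibrate(["high_gain", "low_gain"],
                  lambda s: 0.12 if s == "high_gain" else 0.03)
print(corrected_output(1.12, "high_gain", table))  # recovers the ideal 1.0 V signal
```

In a real system the correction table lives on the baseband processor and the correction value is streamed to the radio over the interface whenever the radio changes state.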