This invention pertains to a device and method for calibrating the power output of a mobile communications device.
In a mobile communication system, the transmit power of the mobile station is controlled to meet two sometimes competing objectives. The first objective is to maintain minimum signal quality standards. If the signal is fading, the mobile station will increase its transmit power so that the received signal at the base station meets the minimum signal quality standard. The second objective is to reduce adjacent channel and co-channel interference so that other devices also using that particular base station may communicate clearly. If the transmit power of a particular device is too high, some of the power may spill into neighboring channels, causing interference with transmissions from other mobile stations. Therefore, the mobile station will, whenever possible, reduce its transmit power to avoid interference, provided that the minimum signal quality standard can be maintained.
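The two competing objectives above can be sketched as a simple closed-loop power control rule. This is an illustrative sketch only; the function names, quality threshold, step size, and power limits are assumptions, not values from the invention.

```python
MIN_QUALITY_DB = 7.0   # assumed minimum signal quality standard (SNR, dB)
STEP_DB = 1.0          # assumed power control step size

def adjust_tx_power(current_power_dbm: float, measured_quality_db: float,
                    max_power_dbm: float = 23.0,
                    min_power_dbm: float = -50.0) -> float:
    """Return the next transmit power level for the mobile station."""
    if measured_quality_db < MIN_QUALITY_DB:
        # Objective 1: restore minimum signal quality at the base station.
        return min(current_power_dbm + STEP_DB, max_power_dbm)
    # Objective 2: back off to reduce adjacent/co-channel interference.
    return max(current_power_dbm - STEP_DB, min_power_dbm)
```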
To effectively control the power level of the mobile station, it is desirable that the power amplifier of the mobile terminal have a linear performance over both frequency and the dynamic range of the power levels required. Unfortunately, mobile devices are the sum of several electronic components, none of which necessarily behaves linearly. Therefore, a typical mobile device will have a non-linear curve when comparing an expected power output to actual power output as seen in FIG. 1. This curve changes at each of the operating frequencies of the mobile device. To compensate for this nonlinearity, the mobile device incorporates a set of offsets (see FIG. 1) and stores them in non-volatile memory. These offsets are designed to bring the actual power output into a linear relation with the expected output. For example, where the actual power output exceeds the expected power output, a negative offset is stored to reduce the actual power output (the circled portion of FIG. 1).
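The stored-offset scheme described above can be illustrated with a small lookup table keyed by operating frequency and requested power level. The table values and structure here are hypothetical; an actual device would store many more points in non-volatile memory. Note that where the actual output exceeds the expected output (the circled portion of FIG. 1), the stored offset is negative.

```python
# Hypothetical calibration table: {frequency_MHz: {requested_dBm: offset_dB}}
offsets = {
    1900: {0.0: 0.5, 10.0: -0.3, 20.0: -1.2},
}

def corrected_power(freq_mhz: int, requested_dbm: float) -> float:
    """Requested power plus the stored offset for this operating point,
    bringing actual output into a linear relation with expected output."""
    return requested_dbm + offsets[freq_mhz][requested_dbm]
```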
In order to calculate these offsets, manufacturers typically measure the output power level at many points across both the frequency band and the dynamic power range of the transmitter. The higher the number of points, the better the accuracy (and linearity) of the resulting output signal. Where Time Division Multiple Access (TDMA) is used, the number of power levels is restricted, and thus the total number of points is relatively reasonable. However, where Code Division Multiple Access (CDMA) is used, an infinite number of power levels may theoretically be used, resulting in an effectively infinite number of points to be tested.
Complicating the problem, while the circuits used in different devices of the same product line are nominally identical, individual variations in the parts used to create those circuits result in the offsets being unique to each device. Thus, each device must be tested individually to ensure proper calibration.
Conventionally, this calibration is done with an expensive rack of equipment including an antenna connected to a receiver and transmitter, several power supply sources, and a processor (typically in a personal computer) to control the rack and communicate with the processor in the mobile device. Initially, the receiver of the mobile device is calibrated by generating a signal at a set frequency and power level and applying it to the mobile device's antenna. The rack processor evaluates the readings within the mobile device processor and calculates an offset, which is then stored by the mobile device. This process is repeated for a number of points at different frequencies and power levels. This is not a fast process because the test equipment must "settle" at each frequency.
After calibration of the receiver chain, the transmitter chain is calibrated. This involves the mobile device transmitting at a number of frequencies and power levels to the antenna of the test equipment. The device communicates with the rack processor and tells the rack processor that it transmitted on x frequency at y power. The rack processor then compares this information to the frequency and power that was received at the test equipment. Again, the test equipment takes time to settle at each operative frequency and power level tested. From the comparison, the rack processor can calculate an offset, which is sent, typically by a serial communication line to the mobile device, which then stores the offset in its memory.
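The transmit-chain comparison described above can be sketched as a loop over test points. This is a minimal illustration; the measurement callback stands in for the rack equipment interface and is a hypothetical placeholder, not an actual test-rack API.

```python
def compute_tx_offsets(points, measure_dbm):
    """For each (frequency, power) test point, compare the power the
    device reports transmitting with the power the rack equipment
    actually received, and derive the offset to be sent back over the
    serial line and stored in the device's memory."""
    stored = {}
    for freq_mhz, expected_dbm in points:
        measured = measure_dbm(freq_mhz, expected_dbm)  # rack measurement
        stored[(freq_mhz, expected_dbm)] = expected_dbm - measured
    return stored

# Usage with a stand-in "rack" whose readings run 1.5 dB hot,
# so the computed offset is -1.5 dB:
hot_rack = lambda f, p: p + 1.5
table = compute_tx_offsets([(1900, 10.0)], hot_rack)
```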
This calibration process is time consuming and costly, adding test time in the factory and demanding expensive testing equipment. Given the intense competition to produce an economical mobile device, any increase in the production cost is undesirable. Thus, manufacturers try to reduce test time by speeding up the measurement capability and/or the communication between the test equipment and the mobile device; or they cut corners and test fewer points across the bandwidth and the dynamic range of the transmitter. Alternatively, the parts used to assemble the device may be made to a more exacting standard, so that the devices within the product line behave identically or the parts themselves behave more linearly and fewer non-linear instances occur. All of these solutions have shortcomings. The first typically requires more expensive test equipment, the cost of which is passed on to the device. The second increases the errors that may occur during use of the device, especially where improper offsets are stored in memory, and the end result is poorer performance. The third also results in a more expensive device because the cost of the more precise parts is higher.
Accordingly, there remains a need in the field of mobile communications device testing, and particularly in the field of mobile phone testing, to provide an economical method and device which reduces the time necessary to test and calibrate a mobile phone without adding substantially to the cost of the test equipment.
The present invention is a loopback module used for calibrating the receiver and transmitter chains of a mobile telephone. The loopback module is controlled by the mobile telephone during the calibration procedure. The phone transmits a signal from the phone antenna to the loopback module. The loopback module changes the frequency of the transmitted signal to create a loopback signal, which is then fed back to the phone through the antenna. Software in the phone evaluates the loopback signal to determine the appropriate offset for the transmitter chain at that frequency and power level. This process is iterated until the desired number of frequencies and power levels are tested for proper calibration. The offsets are stored in memory for later use by the phone.
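The iterative loopback calibration flow can be sketched as follows for a single test point. All names here are illustrative assumptions rather than the phone's actual firmware interface; the measurement callback stands in for the transmit-through-loopback-and-receive path.

```python
def calibrate_point(freq_mhz, target_dbm, loopback_measure,
                    tol_db=0.1, max_iter=20):
    """Iteratively refine the stored offset for one (frequency, power)
    point: transmit at the corrected level, read the looped-back power,
    and adjust until the reading is within tolerance of the target."""
    offset = 0.0
    for _ in range(max_iter):
        measured = loopback_measure(freq_mhz, target_dbm + offset)
        error = target_dbm - measured
        if abs(error) <= tol_db:
            break
        offset += error
    return offset

# Usage with a stand-in transmit chain that runs 2 dB hot; the loop
# converges to a -2 dB offset for this point:
hot_chain = lambda f, p: p + 2.0
stored_offset = calibrate_point(1900, 10.0, hot_chain)
```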
An alternate use of the loopback module is a general integrity check of the phone components. A signal is generated in the phone, sent to the loopback module, and a loopback signal is received by the phone from the loopback module. If the loopback signal fits within a window of acceptable responses, the phone is considered acceptable for calibration. If the loopback signal falls outside the window, the phone is slated for further testing to determine which component is causing the poor response. Upon location and replacement of the defective component(s), the phone is tested again until an acceptable response is acquired, at which time the phone is calibrated.
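The go/no-go decision above can be sketched as a simple window comparison. The acceptance window value is an assumption for illustration; an actual test station would set it from the product's specifications.

```python
ACCEPT_WINDOW_DB = 3.0  # assumed maximum deviation from the nominal response

def passes_integrity_check(expected_dbm: float, loopback_dbm: float) -> bool:
    """True if the loopback response lies within the acceptance window,
    meaning the phone is fit to calibrate; otherwise it is set aside
    for component-level fault isolation."""
    return abs(expected_dbm - loopback_dbm) <= ACCEPT_WINDOW_DB
```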