As is known in the art, phased array radar systems generally must be calibrated in order to operate effectively. Because a phased array radar typically uses many active and passive components to form its aperture vector state (amplitude and phase), these components must be aligned in order to form a high-efficiency radiation beam. While various methods are available to establish the initial calibration state, to correct for variability (recalibration), and to identify failed components for replacement, calibration can often be verified or controlled through either near-field or far-field sensing methods.
In one known technique, the individual components in the chain extending from the antenna beam port are calibrated. After calibration in the factory, these components are assembled in the field to produce an initially calibrated aperture. However, this method of calibration has a number of drawbacks, including, for example, reliance on a system to catalogue the calibrated components in the beam-formation chain, such that mistakes have the potential to produce an uncalibrated system. In addition, this method provides no means of verifying the calibrated state, so that when a vector state error arises, whether due to assembly error or natural component degradation, it cannot be directly detected.
Some conventional techniques for phased array radar calibration in the field require relatively long RF cables to complete a test loop. Amplitude and phase variations of a long cable, caused by temperature fluctuation and cable movement over time, can corrupt the amplitude and phase measurements and degrade the calibration accuracy. During temperature cycling tests, for example, the amplitude and phase errors of the long test cables can significantly corrupt phased array calibration data.
In some known phased array calibration techniques, the temperature around the test cables and test setup is controlled. Alternatively, expensive RF cables and components that are insensitive to temperature, cable movement, and bending can be used. However, these methods do not meet accuracy requirements over large temperature variations. As the ambient temperature rises or falls, the internal cable temperature heats up or cools down more slowly, so the cable temperature change lags the outside temperature change. This lag can cause significant cable phase uncertainty at a given environmental temperature. For a long cable, the time constant for cable temperature stabilization is quite long (minutes or even hours), rendering accurate calibration in the field challenging.
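The lag effect described above can be illustrated with a simple first-order thermal model: after a step change in ambient temperature, the internal cable temperature approaches the new ambient exponentially, and the cable phase drifts in proportion to the remaining temperature offset. The sketch below is illustrative only; the time constant, phase-temperature coefficient, cable length, and test frequency are hypothetical values chosen for demonstration, not parameters from this disclosure.

```python
import math

# Hypothetical parameters for illustration only.
TAU_S = 600.0           # assumed cable thermal time constant, seconds (~10 min)
PHASE_PPM_PER_C = 10.0  # assumed cable phase-temperature coefficient, ppm/degC
CABLE_LEN_M = 30.0      # assumed test cable length, meters
FREQ_HZ = 10e9          # assumed test frequency (X band)
C_M_PER_S = 3.0e8       # speed of light, m/s

def cable_temp(t_s, start_c, ambient_c):
    """Internal cable temperature t_s seconds after ambient steps to ambient_c.

    First-order lag: the cable exponentially forgets its starting temperature.
    """
    return ambient_c + (start_c - ambient_c) * math.exp(-t_s / TAU_S)

def phase_error_deg(delta_t_c):
    """Cable phase change (degrees) for an internal temperature change delta_t_c."""
    # Electrical length of the cable in degrees at the test frequency.
    elec_len_deg = 360.0 * CABLE_LEN_M * FREQ_HZ / C_M_PER_S
    return elec_len_deg * PHASE_PPM_PER_C * 1e-6 * delta_t_c

# Ambient steps from 20 degC to 40 degC at t=0; sample the lagging response.
for t in (0, 300, 600, 1800):
    tc = cable_temp(t, 20.0, 40.0)
    err = phase_error_deg(tc - 20.0)
    print(f"t={t:5d} s  cable={tc:5.1f} degC  phase error={err:6.2f} deg")
```

With these assumed values the cable is still several degrees Celsius behind ambient after one time constant, corresponding to tens of degrees of phase error, which illustrates why two different calibration passes at the same outdoor temperature can disagree.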