Radar systems generally produce a radio frequency signal that is transmitted in a desired direction using a high-gain, narrow-beamwidth antenna. A small portion of the transmitted signal is reflected back from a target in the antenna beam and is received by the radar, generally using the same antenna that transmitted the signal. The time delay between the transmitted signal and the received "echo" equals the two-way distance from the radar to the target divided by the speed of light, which works out to about two nanoseconds per foot of range.
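The two-nanoseconds-per-foot figure follows directly from the speed of light expressed in feet per second. A minimal sketch (the constant and function name are illustrative, not from the source):

```python
C_FT_PER_S = 9.8357e8   # speed of light, ~9.8357e8 ft/s (~0.983 ft/ns)

def round_trip_delay_s(range_ft):
    """Two-way propagation delay, in seconds, to a target range_ft feet away."""
    return 2.0 * range_ft / C_FT_PER_S
```

For a target one foot away, `round_trip_delay_s(1.0)` is about 2.03 ns, matching the rule of thumb above.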
For radar systems requiring short range and high resolution, frequency modulated continuous wave (FM-CW) radars are preferred. In these radars, the transmitted signal is swept in frequency during the detection interval (typically a few milliseconds) over a fairly wide bandwidth (typically several hundred megahertz). Transmit and receive functions also occur simultaneously in FM-CW radars, whereas in traditional pulse radars they do not. Because of the finite time delay to the target and back, the echo from a target differs slightly in frequency from what is currently being transmitted. Mixing the transmitted signal with the echo in the radar receiver generates a beat frequency that is proportional to the target range, provided certain criteria are met.
Two factors related to the quality of the radar's transmitted signal are critical for optimum performance: the linearity of the frequency sweep vs. time, and phase noise, sometimes referred to as phase jitter. For example, assume the radar sweeps 500 MHz of bandwidth in two milliseconds, which corresponds to 500 Hz every two nanoseconds, and thus 500 Hz for each foot of range. A target 100 feet from the radar would generate a beat frequency of 100 × 500 Hz = 50 kHz. If the frequency vs. time slope deviated from its nominal 500 MHz per two milliseconds during the sweep, the beat frequency would deviate proportionately.
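The range-to-beat-frequency arithmetic above can be sketched directly; the function name and the use of the 2 ns/ft rule of thumb are illustrative:

```python
SWEEP_BW_HZ = 500e6      # 500 MHz sweep bandwidth, per the example above
SWEEP_TIME_S = 2e-3      # swept in two milliseconds
SWEEP_RATE = SWEEP_BW_HZ / SWEEP_TIME_S  # 2.5e11 Hz/s

def beat_freq_hz(range_ft, ns_per_ft=2.0):
    """Beat frequency for a target at range_ft feet, using the ~2 ns/ft
    two-way delay rule of thumb from the text."""
    delay_s = range_ft * ns_per_ft * 1e-9
    return SWEEP_RATE * delay_s
```

For the 100-foot target in the example, `beat_freq_hz(100)` gives 50 kHz.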
This nonlinearity would have two deleterious effects. First, it would lower the signal-to-noise ratio because the signal would be "smeared" over multiple frequencies. Second, it would make it difficult or impossible to separate, or "resolve," multiple closely spaced targets, negating the advantages of using a wide bandwidth in the first place. Thus, it is important to maintain strict linearity. One example of a circuit that improves linearity by using a phase lock loop is found in U.S. Pat. No. 5,374,903, which is incorporated herein by reference. However, in this approach phase noise remains a problem.
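The "smearing" effect can be illustrated numerically. The sketch below, with an illustrative sample rate and a hypothetical sweep-slope error (a 2% drift of the beat frequency over the sweep, not a value from the source), compares the spectral peak of an ideal beat tone with that of a nonlinearly swept one:

```python
import numpy as np

# Illustrative parameters: a 50 kHz beat tone observed over the
# 2 ms sweep, sampled at 1 MHz.
fs = 1e6
T = 2e-3
t = np.arange(0.0, T, 1.0 / fs)
f0 = 50e3

# Ideal, perfectly linear sweep: the dechirped beat is a pure tone.
ideal = np.cos(2 * np.pi * f0 * t)

# Hypothetical nonlinearity: the beat frequency drifts by 2% of f0
# (1 kHz) across the sweep, leaving a residual chirp in the beat.
k = 0.02 * f0 / T  # residual slope, Hz per second
smeared = np.cos(2 * np.pi * (f0 * t + 0.5 * k * t**2))

def peak_db(x):
    """Peak spectral magnitude in dB (Hann-windowed FFT)."""
    spectrum = np.abs(np.fft.rfft(x * np.hanning(len(x))))
    return 20 * np.log10(spectrum.max())

# Energy spread out of the peak bin lowers the apparent SNR.
loss_db = peak_db(ideal) - peak_db(smeared)
```

Even this small residual chirp spreads the tone across several FFT bins, lowering the peak by a few dB and blurring the boundary between closely spaced targets.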
The second factor, phase noise control, is an integral part of linearizing the frequency sweep. The effect of high phase noise is that radar returns from close-in targets, or reflections from the antenna, mix with the transmitted signal and raise the noise floor, making it difficult or impossible to reliably detect small targets at long ranges. A voltage-controlled oscillator (VCO) is the basic element used in most FM-CW radars to generate the desired swept-frequency output. Because they must tune over wide bandwidths to achieve the desired range resolution, microwave and millimeter-wave VCOs are inherently noisy and often limit radar performance. It is thus desirable to reduce the phase noise of the free-running VCO by phase locking it to a lower-frequency, much lower-noise source. By reducing the phase noise at offset frequencies in the range of interest, the performance of the radar can be raised to levels limited only by thermal noise, rather than being phase-noise limited.
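The benefit of phase locking to a lower-frequency source can be estimated from the standard multiplication rule: within the loop bandwidth, the locked VCO ideally inherits the reference's phase noise degraded by 20·log10(N), where N is the frequency ratio. All numbers below (reference and VCO frequencies, noise levels) are illustrative assumptions, not values from the source:

```python
import math

f_ref = 100e6        # hypothetical low-noise reference frequency, Hz
f_vco = 24e9         # hypothetical microwave radar VCO frequency, Hz
N = f_vco / f_ref    # effective frequency multiplication ratio (240)

L_ref = -150.0       # assumed reference phase noise at 10 kHz offset, dBc/Hz
L_vco_free = -80.0   # assumed free-running VCO noise at the same offset, dBc/Hz

# Within the loop bandwidth, the locked VCO ideally tracks the
# reference noise multiplied up by 20*log10(N).
L_locked = L_ref + 20.0 * math.log10(N)
improvement_db = L_vco_free - L_locked
```

With these assumed numbers the locked VCO sits near -102 dBc/Hz inside the loop bandwidth, roughly a 22 dB improvement over the free-running VCO, illustrating how locking to a clean reference can push the radar toward the thermal-noise limit.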