A digital delay-locked loop (DLL) generally includes a phase detector that detects the phase difference between a system clock and a feedback clock, and adjusts a time delay circuit in a feedback loop so that the DLL output clock is brought into lock with the system clock. The time delay is generally provided by an adjustable delay line.
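The feedback loop described above can be sketched as a simple behavioral model. The function name, the bang-bang (one-step-per-cycle) control policy, and the picosecond values below are hypothetical illustrations, not a description of any particular implementation:

```python
# Behavioral sketch of a digital DLL locking loop (illustrative only;
# delays are in picoseconds, and all names are hypothetical).

def lock_dll(target_delay_ps, step_ps, initial_delay_ps=0, max_iters=1000):
    """Step an adjustable delay line until the feedback clock aligns
    with the system clock to within one delay step."""
    delay = initial_delay_ps
    for _ in range(max_iters):
        phase_error = target_delay_ps - delay  # phase detector output
        if abs(phase_error) < step_ps:         # within one step: locked
            return delay
        # Bang-bang control: step the delay line up or down by one step.
        delay += step_ps if phase_error > 0 else -step_ps
    return delay

# Example: lock to a 1237 ps target using 50 ps steps.
locked = lock_dll(1237, 50)
```

Because the delay can only change in whole steps, the loop settles to within one step of the target rather than exactly on it, which is the origin of the tracking jitter discussed below.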
Since the adjustable delay line is typically adjusted in steps, the finest delay resolution depends on the step increment of the delay line. To hold the locked condition, the adjustable delay line is continuously stepped up and down around the lock point, which results in inherent tracking jitter. To reduce this jitter, the adjustable delay line includes a plurality of coarse delay elements (CDEs), forming a coarse delay line (CDL), in series with a plurality of fine delay elements (FDEs), forming a fine delay line (FDL). After power-up of the circuit, the CDL is adjusted first; once the lock point has been approximately determined, the FDL is adjusted. This narrows the window, or eye, around the lock point to a nominal amount of jitter in a typical application.
The FDL preferably includes enough steps to provide a maximum time delay equal to or slightly greater than the time delay of a single CDL step. Once the DLL has stabilized at the lock point, the adjustable delay line automatically compensates for delay variations caused by changing temperature and voltage conditions by varying the FDL.
In the case of major drift, adjustments in the FDL will underflow or overflow its minimum or maximum delay. In that case, a CDE is switched out of or into the series path, and at the same time the FDL is adjusted to compensate for the CDL decrease or increase, so that the total delay is the same as before. The FDL can then again compensate for changes without immediate danger of underflow or overflow.
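The underflow/overflow exchange can be illustrated with a small behavioral model. The class, the choice of eight fine steps per coarse step, and the initial settings are hypothetical; the sketch only shows the bookkeeping that keeps the total delay constant across an exchange:

```python
# Sketch of a hierarchical coarse/fine delay-line controller
# (illustrative; all delays are measured in FDL steps).

CDL_STEP_IN_FDL_STEPS = 8   # one coarse step nominally equals 8 fine steps
FDL_MAX = 8                 # FDL spans slightly more than one coarse step

class DelayLine:
    def __init__(self, coarse=4, fine=0):
        self.coarse = coarse  # number of CDEs switched into the path
        self.fine = fine      # current FDL setting

    def total(self):
        return self.coarse * CDL_STEP_IN_FDL_STEPS + self.fine

    def step(self, direction):
        """Adjust the delay by one fine step; on FDL underflow/overflow,
        exchange a coarse step so the total delay is unchanged."""
        self.fine += direction
        if self.fine < 0:                       # FDL underflow
            self.coarse -= 1                    # switch one CDE out of series
            self.fine += CDL_STEP_IN_FDL_STEPS  # compensate with FDL steps
        elif self.fine > FDL_MAX:               # FDL overflow
            self.coarse += 1                    # switch another CDE in
            self.fine -= CDL_STEP_IN_FDL_STEPS

dl = DelayLine(coarse=4, fine=0)
before = dl.total()
dl.step(-1)   # forces an underflow exchange: one CDE out, 8 FDL steps in
after = dl.total()
```

After the exchange the total delay has moved by exactly one fine step, and the FDL setting is back near mid-range, ready to absorb further drift in either direction.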
It is assumed in the prior art that exchanging (or switching) a predetermined number of FDL steps for a CDL step provides an equivalent delay. However, any difference between the two appears as switching jitter on the DLL output.
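The effect of such a mismatch can be shown with simple arithmetic. The picosecond values below are hypothetical examples of process variation, not measured figures:

```python
# Illustration of switching jitter caused by CDL/FDL mismatch
# (hypothetical delay values, in picoseconds).

FDL_STEP_PS = 25.0
FDL_STEPS_PER_CDL = 8           # design assumes 8 * 25 ps == one coarse step
NOMINAL_CDL_STEP_PS = 200.0

# Process variation: the fabricated coarse step comes out slightly short.
ACTUAL_CDL_STEP_PS = 193.0

# The exchange of 8 FDL steps for 1 CDL step is assumed delay-neutral,
# but the actual total delay jumps by the mismatch between the two:
switching_jitter_ps = FDL_STEPS_PER_CDL * FDL_STEP_PS - ACTUAL_CDL_STEP_PS
```

In this example every coarse/fine exchange shifts the output clock edge by 7 ps, and because the mismatch is process-dependent, its magnitude is not known at design time.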
DLL jitter includes factors such as inherent tracking jitter, power supply noise, and substrate-noise-induced jitter. The inherent tracking jitter is caused by the up and down adjustments of the fine delay while the DLL is in the locked condition and, as described above, amounts to a variation of a single FDL step. The jitter caused by switching between CDL and FDL elements, due to the mismatch between those elements, is referred to as switching jitter. This mismatch is highly dependent on the manufacturing process and is therefore difficult to predict at the design stage. As operating frequencies continue to increase, the switching jitter can significantly reduce the data eye. In addition, since this switching occurs only infrequently, it is inherently difficult to detect during testing and can cause apparently random dropped bits when the DLL is in use in the field.
Analog techniques can be used to achieve a wide range of fine-resolution tracking for various applications. In particular, DLLs based on phase mixers have been shown to achieve a wide fine-resolution tracking range through quadrature mixing. However, most analog DLL designs employ some form of charge pump for a voltage-controlled delay line and, as such, suffer from limited delay-step resolution, since the controlling element affects the entire delay line. In addition, such DLLs often require a long acquisition time, because the loop bandwidth must be limited to a small fraction of the clock frequency to ensure stability of the loop. This effect also degrades the jitter performance of analog DLLs.
Furthermore, analog DLL designs are inherently more susceptible to all sources of noise as their control variables (usually voltages) are reduced to achieve finer resolutions. In particular, synchronous dynamic random access memories (SDRAMs) provide a very noisy environment for analog blocks, in the form of supply and substrate noise. Combined with the area restrictions in SDRAMs, which sometimes prevent adequate implementation of layout-based noise prevention techniques, this can result in unreliable DLLs in noisy field environments.
Clearly, there is a need for an improved DLL having reduced switching jitter compared to conventional DLLs.