A delay-locked loop (DLL) is a digital circuit used to controllably monitor and adjust the phase of a periodic digital output signal with respect to a periodic digital input signal (for example, a clock signal). In general, a DLL is a servo-mechanism in which a delay path is adjusted in order to produce a desired phase relationship between the input and output signals. DLLs have been widely used as frequency synthesizers and clock circuits in transceivers, inter-chip communication interfaces and clock distribution networks.
In a “Type I” DLL, a reference signal is compared with a delayed version of itself to perform the phase comparison and generate the output signal. A conventional Type I DLL comprises a delay line, a phase detector element and a loop filter (integrator) that together create an output signal that is phase-matched (i.e., “locked”) to the input signal. The phase detector and loop filter form a feedback path that controls the length of the delay line as necessary for phase matching.
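The feedback action described above can be illustrated with a minimal behavioral sketch. This is an assumed first-order model, not any particular implementation: a linear phase detector measures the error between the delayed edge and the desired one-period delay, and an integrating loop filter accumulates that error to adjust the delay-line length. All names and constants (`PERIOD`, `GAIN`, `dll_lock`) are illustrative.

```python
# Behavioral sketch of a Type I DLL servo loop (assumed model).
# The phase detector compares the reference edge with the delayed
# edge; the loop filter (an integrator) accumulates the phase error
# and servos the delay-line length toward one full reference period.

PERIOD = 10.0   # reference clock period (arbitrary time units)
GAIN = 0.05     # loop-filter integration gain (illustrative value)

def dll_lock(target_delay=PERIOD, cycles=2000, initial_delay=3.0):
    """Servo the delay-line length until the delayed edge is
    phase-matched (one period) to the reference edge."""
    delay = initial_delay
    for _ in range(cycles):
        ref_edge = 0.0                   # reference edge at t = 0
        delayed_edge = ref_edge + delay  # edge after the delay line
        # Phase detector: error between desired and actual delay
        error = target_delay - delayed_edge
        # Loop filter (integrator): accumulate the error into the
        # delay-line control, closing the feedback path
        delay += GAIN * error
    return delay

locked = dll_lock()
print(f"delay after locking: {locked:.4f} (target {PERIOD})")
```

With a small positive gain the residual phase error decays geometrically each cycle, so the loop converges on the one-period delay; a real DLL would quantize the delay-line control, but the servo principle is the same.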
One problem with obtaining an output signal that is properly locked on the input signal phase is the presence of noise in the form of jitter (δ) within the propagating signals. As is well-known in the art, “jitter” can be defined as a variability in the arrival time of an edge of a periodic signal as a result of the presence of noise within the signal (in most cases, the noise exhibits a Gaussian distribution about the expected edge). In the DLL structure, jitter may be present on both the input signal to the delay line and the output signal from the delay line. Indeed, jitter may accumulate at the output, a phenomenon denoted “jitter peaking”, in which jitter present at the input is amplified as it propagates through the delay line and ultimately appears at the output. Jitter thus introduces timing errors into the output signal.
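The accumulation of jitter through the delay line can be sketched with a simple Monte Carlo model. This is an illustrative assumption, not a measurement: edge jitter is modeled as a Gaussian offset on each edge's arrival time, and each delay stage is assumed to contribute its own independent Gaussian noise, so the output edge carries more RMS jitter than the input edge. All constants (`N_STAGES`, `SIGMA_IN`, `SIGMA_STAGE`) are hypothetical.

```python
import math
import random

# Illustrative jitter-accumulation model (assumed, simplified):
# each edge enters the delay line with Gaussian timing noise, and
# each of N_STAGES delay stages adds its own independent Gaussian
# noise, so jitter accumulates along the line toward the output.

random.seed(1)

N_STAGES = 8        # delay stages in the line (hypothetical)
SIGMA_IN = 0.02     # RMS jitter on the input edge
SIGMA_STAGE = 0.01  # RMS jitter added per delay stage
N_EDGES = 50_000    # number of simulated clock edges

def rms(samples):
    """Root-mean-square timing deviation over a set of edges."""
    return math.sqrt(sum(x * x for x in samples) / len(samples))

input_jitter, output_jitter = [], []
for _ in range(N_EDGES):
    j = random.gauss(0.0, SIGMA_IN)   # jitter entering the line
    input_jitter.append(j)
    for _ in range(N_STAGES):         # each stage adds its own noise
        j += random.gauss(0.0, SIGMA_STAGE)
    output_jitter.append(j)           # accumulated jitter at output

print(f"input RMS jitter:  {rms(input_jitter):.4f}")
print(f"output RMS jitter: {rms(output_jitter):.4f}")
```

Under this independence assumption the variances add, so the output RMS jitter approaches sqrt(SIGMA_IN² + N_STAGES · SIGMA_STAGE²), which exceeds the input jitter; the full jitter-peaking effect additionally involves the loop's transfer function amplifying input jitter near its bandwidth edge.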