The primary function of a Delay Locked Loop (DLL) is to divide time into stable, equal pieces. It does this by taking a variable delay line and locking its delay to a stable time reference, generally one phase or one period of an input clock. Because of how the DLL is constructed, an aliasing problem arises when locking the delay line to the input clock reference: if the delay line starts at a delay sufficiently longer than the reference period, it can falsely lock to twice the reference delay (a harmonic lock) or simply fail to lock entirely. As such, most DLLs lock by starting at the minimum delay setting and then letting the delay line slow down, gradually extending the delay setting, until it matches the reference period.
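The aliasing hazard above can be seen in a toy behavioral model. The sketch below (an illustrative assumption, not any specific DLL implementation) models a bang-bang control loop whose phase detector compares the delayed edge against the nearest reference edge, and therefore cannot distinguish a delay of one reference period from a delay of two: a delay line that starts long converges to a harmonic lock.

```python
def lock_dll(initial_delay, t_ref, step=0.01, max_iters=10_000):
    """Toy bang-bang DLL lock model (illustrative, not a real circuit).

    The modeled phase detector aligns the delayed edge to the *nearest*
    reference edge, so it cannot tell t_ref from 2 * t_ref. Starting at
    minimum delay avoids the harmonic; starting long falsely locks.
    """
    delay = initial_delay
    for _ in range(max_iters):
        # Nearest integer multiple of the reference period, as seen by
        # an edge-comparing phase detector.
        target = round(delay / t_ref) * t_ref
        if target == 0:
            target = t_ref  # the detector needs at least one full period
        if abs(delay - target) < step:
            return delay    # loop has settled ("locked")
        # Bang-bang update: stretch or shrink the delay line one step.
        delay += step if delay < target else -step
    return delay
```

Starting near minimum delay (`lock_dll(0.1, 1.0)`) converges to one reference period, while starting long (`lock_dll(1.6, 1.0)`) converges to twice the period, the false lock the text describes.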
By initially forcing the delay line to minimum delay, the circuits around the DLL must be able to keep up with the high-speed edges and pulses that the delay line produces at that setting. This can require the support circuitry to operate at much higher data rates than the application otherwise requires, so it generally consumes much more power, both during the lock-in period and even after lock is achieved, because it must be overdesigned for the worst-case start at minimum delay. Another challenge of operating the DLL at minimum delay is pulse evaporation, which occurs when the input to the delay line toggles faster than the delay line can propagate, so that pulses shrink from stage to stage and vanish before reaching the output.
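Pulse evaporation can be illustrated with another toy model. The sketch below (an assumption for illustration; real stage behavior depends on device asymmetries and bandwidth) shrinks a pulse by a fixed amount per stage, standing in for unequal rise and fall times, and drops it once it falls below the narrowest pulse a stage can propagate.

```python
def propagate_pulse(width, n_stages, min_width, shrink_per_stage):
    """Toy model of pulse evaporation (illustrative parameters).

    Each delay stage is assumed to narrow the pulse slightly, e.g. due
    to mismatched rise/fall times. A pulse narrower than min_width is
    below what a stage can resolve and disappears entirely.
    Returns the surviving pulse width, or 0.0 if the pulse evaporated.
    """
    for _ in range(n_stages):
        width -= shrink_per_stage
        if width < min_width:
            return 0.0  # pulse evaporated partway down the line
    return width
```

With hypothetical numbers, a 100 ps pulse survives 64 stages that each shave 1 ps (`propagate_pulse(100e-12, 64, 20e-12, 1e-12)`), while a 50 ps pulse, produced by a faster-toggling input, drops below the 20 ps floor mid-line and evaporates.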