Jitter is the difference between the expected occurrence of a signal edge and the time the edge actually occurs; equivalently, it is the movement of a signal edge from its ideal position in time. Jitter is introduced as the signal propagates from a transmitting device to a receiving device. At the receiving device, information is extracted from the signal by sampling it at specific instants of time, referred to as sampling instants. Ideally, these sampling instants occur at the center of each data bit time, equidistant between two adjacent edge transition points. Jitter shifts the edge positions with respect to the sampling instants, which can induce errors and loss of synchronization. In a communications system, the accumulation of jitter eventually leads to data errors.
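The error mechanism described above can be illustrated with a minimal sketch (a toy model, not any standard jitter measurement): an edge nominally at the bit boundary is displaced by Gaussian jitter, and a bit error occurs whenever the displaced edge crosses the center sampling instant.

```python
import random

random.seed(0)
T = 1.0            # bit period (one unit interval)
SAMPLE = 0.5 * T   # sampling instant: center of the bit time

def error_rate(jitter_rms, n=100_000):
    """Fraction of bits sampled on the wrong side of a jittered edge.

    A transition nominally at the bit boundary (t = 0) is displaced by
    Gaussian jitter; if it arrives after the center sampling instant,
    the receiver latches the previous symbol and a bit error occurs.
    (Toy model: one edge per bit, jitter magnitudes are illustrative.)
    """
    errors = 0
    for _ in range(n):
        edge = random.gauss(0.0, jitter_rms)  # displaced edge position
        if edge > SAMPLE:                     # edge moved past the sampling instant
            errors += 1
    return errors / n

# As jitter grows, more edges cross the sampling instant and the
# error rate rises; small jitter leaves the center sample untouched.
```

The model captures why the center of the bit time is the ideal sampling point: it maximizes the timing margin on both sides before an edge displacement corrupts the sample.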
Jitter appears as two distinct types: random jitter (RJ) and deterministic jitter (DJ). Random jitter is caused primarily by device thermal noise. Deterministic jitter can be caused by power supply fluctuations, power line noise, cross-talk, and duty-cycle distortion (asymmetric rising and falling edges).
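A simple way to see the decomposition is to synthesize total jitter as the sum of the components just listed. The sketch below is an illustrative toy model (the component amplitudes and the period of the ripple term are arbitrary assumptions): unbounded Gaussian noise stands in for thermal random jitter, while a bounded periodic term (e.g. supply ripple) and an alternating duty-cycle-distortion offset stand in for deterministic jitter.

```python
import math
import random

random.seed(1)

def jitter_sample(k, rj_rms=0.01, dj_pk=0.05, dcd=0.02):
    """Total jitter of edge k as RJ + DJ components (toy model).

    rj_rms : RMS of Gaussian random jitter (models thermal noise)
    dj_pk  : peak of a bounded periodic component (e.g. supply ripple)
    dcd    : duty-cycle distortion, alternating sign on rising/falling edges
    All values are fractions of a unit interval; chosen for illustration.
    """
    rj = random.gauss(0.0, rj_rms)              # random, unbounded
    pj = dj_pk * math.sin(2 * math.pi * k / 37)  # periodic, bounded
    dc = dcd if k % 2 == 0 else -dcd             # DCD alternates per edge
    return rj + pj + dc

samples = [jitter_sample(k) for k in range(10_000)]
```

Setting `rj_rms=0` isolates the deterministic part, whose magnitude never exceeds `dj_pk + dcd`; the Gaussian term, by contrast, has no hard bound, which is the practical distinction between the two types.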
Data-dependent jitter (DDJ) is a component of deterministic jitter that causes timing errors which vary with the data pattern used. Timing errors caused by DDJ in turn produce duty-cycle distortion or intersymbol interference (ISI). DDJ often results from component and system bandwidth limitations and from signal attenuation. The higher-frequency components of the transmitted signal have less time to settle than the lower-frequency components and are attenuated more strongly. This changes the starting conditions of the signal edges and produces timing errors that depend on the data pattern being applied. Current methods for reducing DDJ, such as those performed in an equalizer, involve amplifying the high-frequency components of the signal before transmission.
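Both effects can be sketched with a deliberately simple model (an assumption for illustration, not any specific equalizer design): a first-order lowpass filter stands in for the band-limited channel, and a one-tap FIR pre-emphasis stage boosts the high-frequency content before "transmission". An isolated 1 after a run of 0s has only one bit time to settle, so it reaches a lower amplitude than a long run of 1s; pre-emphasis restores much of that lost amplitude.

```python
def lowpass(x, a=0.3):
    """First-order IIR lowpass: a crude band-limited channel model."""
    y, prev = [], 0.0
    for v in x:
        prev = a * v + (1 - a) * prev
        y.append(prev)
    return y

def preemphasis(x, c=1.0):
    """Boost high-frequency content before transmission (one-tap FIR).

    Adds c times the first difference, so fast transitions are amplified
    while steady runs pass through unchanged (a toy equalizer sketch).
    """
    return [v + c * (v - p) for p, v in zip([0.0] + x, x)]

# An isolated '1' after a long run of '0's: the worst case for settling.
pattern = [0] * 8 + [1] + [0] * 4

raw = lowpass(pattern)               # without pre-emphasis
pe = lowpass(preemphasis(pattern))   # with pre-emphasis

# With a=0.3, c=1.0: the isolated bit reaches 0.3 raw but 0.6 with
# pre-emphasis, i.e. closer to the full amplitude a long run would reach.
```

Because the isolated bit now swings closer to its steady-state level, the zero crossings of the following edges move back toward their ideal positions, which is exactly the DDJ reduction the pre-emphasis is intended to provide.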