Data rates continue to increase in digital systems, communication systems, computer systems, and other applications. In such applications, various devices may communicate by sending or receiving signals that are encoded with information in the form of signal levels (e.g., amplitude) in certain intervals of time. Proper decoding of periodic signals, for example, may involve measuring the signal level in the correct time interval, or period. As data rates increase, the timing margin for measuring the signal level tends to decrease.
In general, an error in which a signal's timing deviates from its ideal timing is sometimes referred to as "jitter." For example, an increase in jitter for a signal encoded with a stream of digital data may cause an increase in a bit error rate (BER) for that data stream.
Jitter may be added to a signal from a variety of sources. For example, one source of jitter may be the circuit elements used to generate, transmit, convey, process, and/or receive the signal. To various degrees, circuit elements may add jitter to the signal through cross-talk, reflections, shot noise, flicker noise, and/or thermal noise. Electromagnetic interference (EMI) may also contribute to jitter.
Typically, jitter may be measured as total jitter. Total jitter may represent a convolution of all independent jitter components, which can include contributions from deterministic and/or random components. Random jitter, such as that caused by noise, typically exhibits a Gaussian distribution. Deterministic jitter may include periodic jitter, duty cycle distortion, and/or intersymbol interference, for example.
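The convolution of independent jitter components can be sketched numerically. The following is an illustrative example only, not taken from the source: it models the random component as a Gaussian and the deterministic component as a dual-Dirac pair (a common simplification), with all numeric values (3 ps RJ sigma, ±10 ps DJ peaks) assumed for illustration.

```python
import numpy as np

# Assumed, illustrative values -- not from the source.
t = np.linspace(-50e-12, 50e-12, 2001)   # time axis in seconds
dt = t[1] - t[0]

sigma = 3e-12                            # assumed random-jitter std. dev.
rj = np.exp(-t**2 / (2 * sigma**2))
rj /= rj.sum() * dt                      # normalize PDF to unit area

dj = np.zeros_like(t)                    # dual-Dirac deterministic jitter
dj[np.argmin(np.abs(t + 10e-12))] = 0.5 / dt   # peak at -10 ps
dj[np.argmin(np.abs(t - 10e-12))] = 0.5 / dt   # peak at +10 ps

# Total jitter PDF as the convolution of the two components.
tj = np.convolve(rj, dj, mode="same") * dt
print(f"total-jitter PDF area: {tj.sum() * dt:.3f}")
```

The resulting distribution is bimodal: each deterministic peak is broadened by the Gaussian random component, which is why total jitter cannot be characterized by a single Gaussian fit.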
Jitter measurements may be made in a variety of applications, examples of which may include Fibre Channel, Gigabit Ethernet, XAUI, InfiniBand, SONET, Serial ATA, 3GIO, and Firewire. To illustrate the importance of jitter in such applications, a nanosecond of jitter in a 100BASE-T (100 Mb/s) device may represent a 10% data uncertainty. The same nanosecond of jitter may represent a 20% data uncertainty if the data rate is increased to 200 Mb/s, or a 100% data uncertainty in a Gigabit Ethernet device.
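The percentages above follow from expressing the jitter as a fraction of one bit period (the unit interval, the reciprocal of the data rate). A minimal sketch of that arithmetic, using a hypothetical helper function:

```python
def jitter_uncertainty(jitter_s, data_rate_bps):
    """Return jitter as a fraction of one unit interval (bit period)."""
    unit_interval_s = 1.0 / data_rate_bps
    return jitter_s / unit_interval_s

jitter = 1e-9  # one nanosecond of jitter
for rate in (100e6, 200e6, 1e9):
    print(f"{rate / 1e6:.0f} Mb/s: {jitter_uncertainty(jitter, rate):.0%}")
# 100 Mb/s -> 10%, 200 Mb/s -> 20%, 1000 Mb/s -> 100%
```

At 100 Mb/s the unit interval is 10 ns, so 1 ns of jitter consumes 10% of it; at 1 Gb/s the unit interval is only 1 ns, and the same jitter consumes the entire bit period.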
Reliability of a communication system may influence the effective capacity of that system for communicating information. One measure of the reliability of a data channel is its bit error rate (BER). For example, if a data channel has an average of one bit error for every one thousand bits transmitted, then that channel may be described as having a BER of 1/1000, or 10^−3. If the reliability of the communication channel is degraded, data rates and/or data volume may need to be limited so as not to exceed the channel's capacity.
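The BER definition above is simply the ratio of errored bits to transmitted bits, as in this brief sketch:

```python
def bit_error_rate(bit_errors, bits_transmitted):
    """Average number of bit errors per bit transmitted."""
    return bit_errors / bits_transmitted

# One error per thousand bits corresponds to a BER of 10^-3.
print(bit_error_rate(1, 1000))  # 0.001
```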
Examples of data channels may be found in backplanes, optical systems, wireless communication systems, and/or wired communication links. Some communication systems are configured as electronic networks that allow data to be shared (e.g., movies, files, images, voice). Data channels of various types may be connected to form networks, such as WAN (wide area networks), LAN (local area networks), and/or MAN (metropolitan area networks). When using such networks, information may be transmitted through wired data channels, such as DSL (digital subscriber line), and/or wireless data channels, such as WiMax or WiFi, for example.
As data rates increase, maintaining a specified BER in a typical data channel tends to become increasingly difficult. In some applications, increased BER leads to increased requests for re-transmissions, which may in turn reduce the effective capacity of the data channel. Accordingly, data channels with low bit error rate performance may provide higher effective data capacity.