Transmission speeds of digital data are constantly being pushed to the next limit. Everybody wants their computer or smartphone to run faster and to download ever-larger files more quickly. Speed, however, is not the only concern: it is equally important that the received data actually match what was sent. All data is ultimately broken down into, and reconstructed from, simple 0s and 1s. Every byte sent through digital communication, no matter how physically far it travels, must be readable at the receiving end without error and without loss.
A clock signal is produced by a clock generator, which typically comprises a resonant circuit and an amplifier. The clock signal oscillates between a high and a low state, usually at a fixed frequency. A microprocessor contains many clocks serving many different functions. The reading of a digital data transmission is triggered by either the rising edge or the falling edge of a clock signal.
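The idea of edge-triggered reading can be sketched in a few lines of Python. This is an illustrative simulation, not real hardware behavior: the clock and data waveforms are made-up sample lists, and the function name is a placeholder.

```python
def sample_on_rising_edge(clock, data):
    """Return the data bits captured at each rising edge of the clock."""
    bits = []
    for prev, curr, bit in zip(clock, clock[1:], data[1:]):
        if prev == 0 and curr == 1:  # low-to-high transition = rising edge
            bits.append(bit)
    return bits

clock = [0, 1, 0, 1, 0, 1, 0, 1]  # fixed-frequency clock signal
data  = [1, 1, 0, 0, 1, 1, 0, 0]  # data line held stable around each edge
print(sample_on_rising_edge(clock, data))  # → [1, 0, 1, 0]
```

A falling-edge receiver would simply test for a high-to-low transition instead; either convention works as long as sender and receiver agree on it.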
In synchronous data transmission, no start or stop bits are typically used; instead, the sending and receiving ends synchronize their transmission speeds using clock signals. While the data transfer is faster without start and stop bits, the clocks eventually drift out of sync and some bytes become corrupted. The data and clock signals can be delayed by different amounts due to environmental factors such as temperature drift, and due to phase jitter. Jitter is a deviation from the true periodicity of an oscillating signal in frequency, amplitude, or phase. It may be caused by electromagnetic interference and crosstalk with carriers of other signals. Jitter may be reduced through the use of filters and buffers, but it remains a problem. Current solutions to clock delay require frequent re-synchronization of the clocks, the use of a check or parity bit, or the use of a data rate much slower than the clock rate during digital transmissions.
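The accumulation of clock drift can be made concrete with a small sketch. The numbers below are purely illustrative (integer time units, a receiver clock running 1% slow, sampling at the center of each bit cell), and the function is a hypothetical model rather than a description of any particular transceiver.

```python
def first_misread_bit(sender_period, receiver_period, max_bits=10_000):
    """Index of the first bit sampled outside its bit cell, assuming both
    clocks start in phase and the receiver samples at mid-cell."""
    for n in range(max_bits):
        # Phase error grows linearly with each bit time.
        phase_error = n * (receiver_period - sender_period)
        if abs(phase_error) > sender_period / 2:  # sample left the bit cell
            return n
    return None  # no misread within max_bits

# Periods in arbitrary integer time units: the receiver runs 1% slow.
print(first_misread_bit(100, 101))  # → 51
```

Even a 1% frequency mismatch corrupts data after only about fifty bit times, which is why the solutions above rely on frequent re-synchronization or on error-detecting bits.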