In many applications, including digital communications, clock and data recovery (CDR) systems are employed to recover the correct timing (e.g., frequency and phase) of an input data stream; the recovered timing is then used to sample the input data stream and recover the user data for decoding. A serializer/deserializer (SerDes) device is commonly used in high-speed communications to convert data between serial and parallel interfaces in each transmit/receive direction.
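The sampling step described above can be illustrated with a minimal sketch. Here the recovered frequency is represented as a fixed number of samples per bit and the recovered phase as a sample offset to the bit center; both names (`samples_per_bit`, `phase`) are illustrative assumptions, not terms from the source.

```python
def recover_bits(samples, samples_per_bit, phase):
    """Sketch: once CDR has recovered frequency (samples_per_bit) and
    phase (offset to the bit center), the serial stream is sampled
    once per bit interval to recover the user data."""
    bits = []
    i = phase
    while i < len(samples):
        bits.append(1 if samples[i] > 0 else 0)  # slice at the bit center
        i += samples_per_bit
    return bits
```

A real CDR continuously adjusts the sampling phase with a feedback loop; the fixed offset here only illustrates the final sampling operation.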
SerDes devices often employ an encoding scheme that maintains DC balance, provides framing, and guarantees signal transitions. Guaranteed transitions allow the receiver's CDR to extract the embedded clock signal, while control codes allow framing, typically at the start of a data packet. Such an encoding scheme also improves error detection through running disparity, separates data characters from control characters, and permits derivation of byte and word synchronization.
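The running-disparity bookkeeping mentioned above can be sketched as follows, assuming a DC-balanced code with 10-bit codewords whose individual disparity is 0 or +/-2 (as in 8b/10b-style codes); the function names and the simplified error rule are illustrative assumptions.

```python
def word_disparity(word):
    """Disparity of a 10-bit codeword: count of ones minus zeros."""
    ones = bin(word).count("1")
    return ones - (10 - ones)

def check_running_disparity(words, rd=-1):
    """Track running disparity (RD) across a stream of 10-bit words.

    In a DC-balanced code, each word has disparity 0 or +/-2, and a
    nonzero-disparity word must move RD toward the opposite sign.
    Returns the indices of words that violate this rule, which is how
    running disparity provides error detection at the receiver.
    """
    errors = []
    for i, w in enumerate(words):
        d = word_disparity(w)
        if d == 0:
            continue                        # neutral word: RD unchanged
        if abs(d) != 2 or (d > 0) == (rd > 0):
            errors.append(i)                # disparity violation detected
        else:
            rd = -rd                        # RD flips between -1 and +1
    return errors
```

Because RD alternates sign, a single flipped bit that skews a codeword's disparity is caught at or shortly after the corrupted word.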
The ability to detect the loss of an incoming signal is often a system requirement. Even in systems where loss-of-signal (LOS) detection is not required, it is often beneficial to determine whether a usable incoming signal is being received. Existing LOS detection mechanisms use analog peak detectors to monitor the amplitude of the incoming serial data, compare it to a programmable threshold, and set an LOS flag when the peak amplitude falls below that threshold. However, the variety of attenuation sources in the connection media, together with the dependence on the frequency content of the received data, makes determining a suitable threshold difficult. These variations generally preclude a single threshold setting that is valid for every possible source of signal attenuation.
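The threshold comparison performed by the analog peak detector can be sketched digitally as follows; the window size, threshold value, and function name are illustrative assumptions, and a hardware implementation would use an analog peak detector and comparator rather than digitized samples.

```python
def los_flags(samples, threshold, window=64):
    """Sketch of amplitude-based LOS detection on digitized samples.

    Tracks the peak absolute amplitude over consecutive windows and
    asserts LOS for any window whose peak falls below the threshold,
    mirroring the peak-detector-plus-comparator scheme.
    """
    flags = []
    for start in range(0, len(samples), window):
        peak = max(abs(s) for s in samples[start:start + window])
        flags.append(peak < threshold)  # True -> LOS asserted
    return flags
```

The difficulty described above shows up directly here: the correct `threshold` depends on the attenuation of the channel and the spectral content of the data, so no single value suits all links.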