In many communication protocols for computer and other systems, a high-speed serial receiver is used to recover an incoming analog signal, e.g., received from an input line, and to convert the resulting serial data stream into parallel frames. De-serialization converts the serial stream to parallel form so that it can be processed at lower speeds. In a conventional alignment process, the recovered data is accumulated and N alignment hypotheses (where N is the number of bits in a frame) are checked in digital circuitry to determine the correct frame boundary. This introduces a processing latency of up to N−1 bits. The exact latency depends on the arbitrary timing difference between the two link partners, which can change on every link establishment.
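The conventional hypothesis check described above can be sketched as follows. This is an illustrative example, not taken from the source: it assumes a 64b/66b-style line code in which every 66-bit frame begins with a sync header of "01" or "10", and it validates each candidate offset against several consecutive frames (the function name `find_alignment` and the parameter values are assumptions for illustration).

```python
N = 66  # frame width in bits (e.g., 10GBASE-KR)

def find_alignment(bits, n=N, frames_to_check=8):
    """Return the first bit offset whose sync header is valid for
    `frames_to_check` consecutive frames, or None if none matches.

    In a real 64b/66b stream the payload is scrambled, so a false
    header is unlikely to repeat across many frames; checking several
    consecutive frames per hypothesis exploits that property.
    """
    for offset in range(n):                    # the N alignment hypotheses
        if all(bits[offset + f * n:offset + f * n + 2] in ("01", "10")
               for f in range(frames_to_check)):
            return offset
    return None
```

Accumulating enough recovered bits to test all N hypotheses in this way is precisely what introduces the up-to-N−1-bit latency described in the passage.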
A serial receiver typically includes an analog front end that processes the serial signal at a high rate, a serial-to-parallel conversion block (de-serializer), and logic circuitry that processes the parallel data at a lower speed. Parallelization may be performed using a clock with an arbitrary phase, which is not synchronized to any frame boundary. Therefore, a conventional digital alignment procedure must account for a processing latency of up to N−1 bits, which can be significant: in current communication protocols the frame width can exceed 100 bits. For example, the Peripheral Component Interconnect Express (PCIe) Third Generation (Gen3) frame width is 130 bits, and the 10GBASE-KR frame width is 66 bits.
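The effect of an arbitrary-phase parallelization clock can be illustrated with a minimal sketch (an assumption for illustration, not from the source): the de-serializer simply groups incoming bits into fixed-width parallel words starting at whatever phase its clock happens to have, so a frame boundary generally lands at an arbitrary bit position inside a word, which is why the separate alignment step is needed.

```python
def deserialize(bits, width):
    """Group a serial bit string into parallel words of `width` bits,
    starting at whatever phase the parallel clock happens to have.
    Any trailing partial word is dropped for simplicity."""
    return [bits[i:i + width] for i in range(0, len(bits) - width + 1, width)]
```

For instance, a 66-bit frame de-serialized into 8-bit words spans nine words, and its first bit coincides with a word boundary only if the two phases happen to match.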