The bandwidth requirements of computing device input/output (I/O) interfaces have been increasing to meet data consumer demand. For example, high-speed serial I/O interfaces are approaching 10 Gbit/s. Providing reliable data communications at these speeds is often complex and challenging, as inter-symbol interference (ISI), random and deterministic jitter, crosstalk, and supply noise can severely degrade the signal, making recovery of the signal on the receiver side difficult. I/O interface link establishment and other operational functions, such as equalization adaptation, require a valid input signal.
Conventional signal detection methods are subject to false signal detection (for instance, detecting a signal when no signal is present) and other adverse results. For example, a typical delay-based method involves causing the system to wait to improve the likelihood that a valid signal is present. However, such delay-based methods result in longer than necessary link establishment times for link partners. In another example, an analog circuit may be used to detect an incoming signal amplitude. Such analog circuits are problematic due to, among other things, silicon variation affecting the accuracy of the detection thresholds. In addition, at high data rates an input signal may have a very low amplitude, in which case crosstalk or other interference may be perceived as a valid signal, or a signal with sufficient amplitude may be present but may not have a correct data rate. Accordingly, signal detection techniques are needed that provide efficient and accurate signal detection for communication systems.
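The false-detection weakness of amplitude-only detection can be illustrated with a minimal sketch. The function below is hypothetical (not a circuit or method described here); it models the naive amplitude-threshold approach the preceding paragraph criticizes, showing that a crosstalk burst and a genuine low-amplitude signal can clear the same fixed threshold and thus be indistinguishable by amplitude alone:

```python
def amplitude_detect(samples, threshold):
    """Naive detector: report a 'valid signal' whenever the
    peak-to-peak swing of the sampled input exceeds a fixed
    amplitude threshold (illustrative only)."""
    swing = max(samples) - min(samples)
    return swing > threshold

# Hypothetical sample values, in arbitrary voltage units:
signal = [0.0, 0.12, -0.12, 0.11, -0.13]     # attenuated valid data
crosstalk = [0.0, 0.14, -0.10, 0.13, -0.11]  # interference, no valid data

print(amplitude_detect(signal, 0.2))     # True
print(amplitude_detect(crosstalk, 0.2))  # True (false detection)
```

Because both inputs exceed the threshold, the detector cannot tell interference from a valid signal, and it says nothing about whether the detected signal has the correct data rate; this is the gap the techniques discussed here aim to close.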