Impulse noise can occur in digital subscriber line (xDSL) systems due to electromagnetic interference from sources such as telephone networks and power systems, and even from natural phenomena such as thunderstorms and lightning. The effects of impulse noise significantly limit the reliability of real-time services, such as video, that can be supported by current-generation xDSL systems, for example, VDSL (Very High Speed DSL). As a result, there is a significant need for the development and standardization of techniques for detecting and mitigating the effects of impulse noise.
The detection of impulse noise may be performed in the time domain, the frequency domain, or even jointly in the time-frequency domain. For instance, various techniques are used to detect impulse noise in OFDM (Orthogonal Frequency-Division Multiplexing) systems. These techniques include signal-power-based time domain and frequency domain detection, as well as frequency domain detection based on mean-squared error monitoring. Other techniques utilize a joint time-frequency impulse noise detector based on signal power computations performed in both domains. Still other impulse noise detectors monitor the metric used in decoding the inner code, while a related technique compares the hard-decision and soft-decision outputs of a convolutional coded modulation scheme. Further techniques detect RS (Reed Solomon) codeword symbols corrupted by impulse noise and provide protection against impulse noise using RS decoding with erasure correction.
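To make the first class of techniques concrete, the following is a minimal sketch of a signal-power-based time domain detector: short windows whose average power greatly exceeds the long-term average are flagged as impulse-noise events. The window length and threshold factor are illustrative assumptions, not values prescribed by any standard.

```python
import numpy as np

def detect_impulse_time_domain(samples, window_len=32, threshold_factor=6.0):
    """Flag windows whose short-term power exceeds a multiple of the
    long-term average power (window_len and threshold_factor are
    hypothetical tuning parameters)."""
    power = np.asarray(samples, dtype=float) ** 2
    baseline = power.mean()  # long-term average power estimate
    n_windows = len(power) // window_len
    flags = []
    for w in range(n_windows):
        win_power = power[w * window_len:(w + 1) * window_len].mean()
        flags.append(win_power > threshold_factor * baseline)
    return flags
```

In practice the baseline power estimate would be tracked adaptively (e.g., by a slow exponential average) rather than computed over the whole buffer, so that the detector can follow changes in the stationary noise floor.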
Within xDSL systems, DMT (Discrete Multitone) symbol-based impulse noise detectors are generally utilized to monitor DMT symbols that are corrupted by impulse noise so that statistics related to the occurrence of impulse noise events (such as inter-arrival time) can be tracked. In addition, detection of impulse noise can be utilized to trigger impulse noise mitigation schemes such as RS decoding with erasures, blanking, or retransmission. One approach to detecting the presence of impulse noise in a DMT symbol is based on monitoring the post-FEQ (frequency domain equalizer) slicer error for a set of M tones (that are not necessarily contiguous) and comparing the errors with tone-dependent thresholds.
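The slicer-error approach above can be sketched as follows. The constellation (4-QAM on every monitored tone) and the decision rule (declare the symbol corrupted when the squared slicer error exceeds its tone-dependent threshold on a minimum number of the M monitored tones) are illustrative assumptions; an actual detector would use each tone's loaded constellation and a tuned decision rule.

```python
import numpy as np

def nearest_qam4_point(z):
    """Slicer: nearest point of a unit 4-QAM constellation (illustrative)."""
    return np.sign(z.real) + 1j * np.sign(z.imag)

def symbol_corrupted(post_feq, monitored_tones, thresholds, min_tones_over=3):
    """Declare a DMT symbol corrupted when the squared post-FEQ slicer
    error exceeds its tone-dependent threshold on at least
    `min_tones_over` of the M monitored tones (hypothetical rule)."""
    count = 0
    for k, thr in zip(monitored_tones, thresholds):
        err = post_feq[k] - nearest_qam4_point(post_feq[k])
        if abs(err) ** 2 > thr:
            count += 1
    return count >= min_tones_over
```

Because the monitored tones need not be contiguous, they can be spread across the band so that a narrowband disturber is unlikely to trip the symbol-level decision, while a broadband impulse affecting many tones at once is.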
However, one perceived shortcoming with such detectors is that key design parameters are not tuned or optimized in order to meet diverse performance requirements, which include the following: (1) achieving an average BER (bit error rate) at the output of Reed Solomon decoders with erasure correction which aligns with industry requirements (i.e., a BER of 10⁻⁷ for such services as ADSL and VDSL) despite the occurrence of misdetections at the output of the impulse noise detector; (2) seamless integration of the impulse noise detector with such existing transceiver features as maintaining a certain noise margin and seamless rate adaptation (SRA) mechanisms; and (3) minimizing the complexity of the impulse noise detector. Therefore, a heretofore unaddressed need exists in the industry to address the aforementioned deficiencies and inadequacies.