Providers of digital television (DTV) services typically use two-way hybrid fiber-coaxial (HFC) networks, which are shared bi-directional networks with point-to-multipoint transmission in the downstream direction, using digital signals or a mix of analog and digital signals, and multipoint-to-point transmission in the upstream direction. Signals are distributed via a fiber optic connection from a head-end to a node that converts the optical signal to an electrical signal and then distributes the signals to residences via a tree-and-branch coaxial cable distribution network termed the 'cable plant'. At the subscriber side, terminal equipment supports the delivery of cable services, which may include video, data and voice services, to subscribers via cable modems.
Delivery of data services over cable networks, and in particular cable television (CATV) networks, is typically compliant with a Data Over Cable Service Interface Specification (DOCSIS®) standard. The term 'DOCSIS' generally refers to a group of specifications published by CableLabs that define industry standards for cable headend equipment, such as the Cable Modem Termination System (CMTS), and cable modem (CM) equipment. The physical layer specification of DOCSIS provides for the use of frequency multiplexing and several specific forms of quadrature amplitude modulation (QAM) for both upstream (CM to headend) and downstream (headend to CM) communications. Upstream and downstream signals occupy separate frequency bands, called the upstream and downstream frequency bands. Downstream information channel signals co-propagate in the downstream frequency band, and upstream signals co-propagate in the upstream frequency band. The frequency separation of the upstream and downstream signals allows bidirectional amplification of these signals, which propagate in a common cable in opposite directions. In the United States, most of the cable equipment installed at the time of this writing complies with the DOCSIS 3.0 version of the DOCSIS standard, which typically provides for an upstream spectral band from 5 MHz to 42 MHz, with the downstream channels using the 64-QAM or 256-QAM modulation format and 6 MHz spacing within a downstream spectral band spanning from 50 MHz to 860 MHz. The upstream channel widths are configurable and may take a set of defined values between 200 kHz and 6.4 MHz, each corresponding to a specific symbol rate, with the upstream data modulated with either QPSK, 16-QAM, 32-QAM, 64-QAM or 128-QAM.
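The relationship between a channel's symbol rate and QAM order and its raw (pre-FEC) bit rate can be sketched as follows; the 5.36 Msym/s figure used in the example is the approximate downstream symbol rate of a 256-QAM channel in a 6 MHz slot (per ITU-T J.83 Annex B), and the function name is illustrative rather than part of any DOCSIS API.

```python
import math

def raw_bit_rate(symbol_rate_msym_s: float, qam_order: int) -> float:
    """Raw (pre-FEC) channel bit rate in Mbps: each symbol carries
    log2(M) bits for M-ary QAM, so rate = symbol rate * log2(M)."""
    return symbol_rate_msym_s * math.log2(qam_order)

# Approximate 256-QAM downstream channel in a 6 MHz slot:
print(round(raw_bit_rate(5.36, 256), 2))  # ≈ 42.88 Mbps before FEC overhead
```

The same arithmetic applies to the upstream: for example, a 6.4 MHz-wide upstream channel with a lower-order constellation such as 16-QAM carries 4 bits per symbol instead of 8.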
The upstream and downstream signals are prone to impairments that may originate at a plurality of locations in the network. As a result of the "tree" structure of the cable plant, there may be numerous devices, cable segments and connectors located between the fiber optic node and the end user. This provides a plurality of locations where a defect can occur, resulting in either no service or reduced service to the end user. In order to ensure adequate performance, the cable plant needs to be monitored and tested, and the source of impairments identified and located.
Tracing the source of an impairment typically requires that a technician travel to different network locations and compare measurements to locate the impairment. Portable network testing devices currently used in the industry may help to identify certain types of defects in the cable plant by performing specific spectral and noise measurements in the upstream and/or downstream directions using specialized testing methods at different network locations. A number of tests can also be performed to evaluate the quality of digital TV signal transmission at higher logical levels of data transmission, for example by measuring such parameters as carrier level or amplitude, modulation error ratio (MER), bit error rate (BER), ingress under carrier (IUC), and other parameters. The measurements may be performed on a channel-by-channel basis, with the diagnostic data for each channel summarized on a separate screen or data page viewed by the technician on the tester's visual display.
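As a minimal illustration of one such parameter, MER can be computed as the ratio of average ideal-symbol power to average error-vector power; the sketch below assumes ideal and received constellation points are available as complex samples, and the symbol values are hypothetical.

```python
import math

def mer_db(ideal_symbols, received_symbols):
    """Modulation error ratio in dB: average power of the ideal
    constellation points divided by the average power of the error
    vectors (received symbol minus ideal symbol)."""
    sig_power = sum(abs(s) ** 2 for s in ideal_symbols)
    err_power = sum(abs(r - s) ** 2
                    for r, s in zip(received_symbols, ideal_symbols))
    return 10 * math.log10(sig_power / err_power)

# Ideal QPSK constellation with a small constant offset error:
ideal = [1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]
received = [s + 0.02 + 0.02j for s in ideal]
print(round(mer_db(ideal, received), 1))  # → 34.0 dB
```

A lower MER indicates a noisier or more distorted channel; field testers report this figure per channel alongside BER and carrier level.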
One type of defect that may be particularly hard to locate is a defect that leads to changes in impedance along the signal path in the cable plant. These defects may be caused by cable corrosion, which may result from the cable getting scratched and the outer shield rusting away due to exposure to water, by "rodent chews", by crushed, pinched or kinked cables, or by opens, shorts or partials in the cable. Impedance changes may also be caused by set screws inside housings, i.e. the screws that connect the center conductor of the cable to internal circuits of the amplifiers, splitters, taps, and fiber nodes, becoming loose if they are under-torqued or causing oxidation through the galvanic process if they are over-torqued. Defects of these types may be located using time domain reflectometry (TDR), which may include launching a short pulse into the cable and detecting reflections from the location of the impedance change, with the time delay between the transmission and the reflection indicating the distance to the fault. This may, however, require that service to the customers be disconnected during the measurements, so that the strong TDR pulses do not interfere with the downstream TV signals at the end user locations, and so that the weak reflected TDR pulses are not obscured by the upstream DTV signals from the end users. As service to many customers may be impacted due to the tree structure of the cable plant, cable operators are understandably reluctant to perform such measurements because of potential customer complaints. Another approach could be to replace all possibly suspect connections, cables and/or devices in the hope that the defective part is among them. Drawbacks of this approach include increased cost and the fact that the root cause of the problem remains unidentified.
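The distance-from-delay relationship underlying TDR can be sketched as follows; the pulse travels to the fault and back, so the one-way distance is half the round-trip path, scaled by the cable's velocity of propagation. The 0.87 velocity factor is an assumed typical value for hardline coax, not a value taken from the text above.

```python
C_M_PER_S = 299_792_458  # speed of light in vacuum, m/s

def tdr_fault_distance_m(round_trip_delay_s: float,
                         velocity_of_propagation: float = 0.87) -> float:
    """Distance to an impedance fault from the delay between the
    launched TDR pulse and its reflection. Division by two accounts
    for the round trip; VoP ~0.87 is an assumed typical coax value."""
    return velocity_of_propagation * C_M_PER_S * round_trip_delay_s / 2

# A reflection arriving 1 microsecond after the launched pulse:
print(round(tdr_fault_distance_m(1e-6)))  # → 130 m
```

The nanosecond-scale timing resolution this implies is one reason TDR pulses must be strong and the line quiet, motivating the service disconnection described above.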
Accordingly, it may be understood that there may be significant problems and shortcomings associated with current solutions and technologies for locating impedance-changing faults in a cable TV network.