Subsea optical communication systems require routine monitoring to maintain performance and minimize loss of service. Monitoring generally aims to detect wet plant faults, and potentially external threats, at an early stage. Established monitoring technologies include line monitoring systems (LMS), which detect the peaks of signals looped back through loopback paths associated with each undersea repeater and each terminal. The loopback signals can be either high loss loopback (HLLB) signals or optical time domain reflectometry (OTDR) signals.
When performance changes along the optical path, the amplitudes of the loopback signals associated with the repeaters surrounding the fault location change as well. These changes present distinct patterns that can be used to identify fault conditions, including, for example, changes in fiber span loss, changes in optical amplifier pump laser output power, and fiber breaks.
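The pattern-matching idea can be illustrated with a minimal sketch. The function below compares per-repeater loopback amplitudes against a baseline and applies two illustrative signature rules; the rules, threshold, and fault labels are assumptions for the sake of the example, not the signatures used by any particular LMS.

```python
# Hypothetical sketch: classifying a wet-plant fault from changes in
# per-repeater loopback amplitudes relative to a baseline. The
# signature rules and threshold below are illustrative assumptions.

def classify_fault(baseline, current, threshold=0.5):
    """Compare loopback amplitudes (dB) per repeater against a baseline
    and return (fault_type, repeater_index) for the first signature found.
    """
    deltas = [c - b for b, c in zip(baseline, current)]
    for i, d in enumerate(deltas):
        if abs(d) < threshold:
            continue
        # A common drop seen from repeater i onward suggests added loss
        # in the fiber span preceding repeater i (illustrative rule).
        if d < 0 and all(deltas[j] - d < threshold
                         for j in range(i, len(deltas))):
            return ("span_loss_increase", i)
        # An isolated change at a single repeater suggests a pump-power
        # change in that repeater's amplifier (illustrative rule).
        if all(abs(deltas[j]) < threshold
               for j in range(len(deltas)) if j != i):
            return ("pump_power_change", i)
        return ("unclassified", i)
    return ("no_fault", None)
```

A real LMS signature analyzer would, of course, use calibrated signatures for the specific repeater design and loopback architecture; the point here is only that each fault type maps to a recognizable pattern in the delta vector.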
Some approaches to recognizing fault conditions from a corresponding fault signature employ automatic signature analysis (ASA), implemented as a finite state machine for pattern analysis. Unfortunately, while these existing ASA-based techniques can detect relatively large changes in the transmission system, they often lack the accuracy to report small changes that may indicate gradual degradation of a particular element over time. They are also unable to report the magnitude of a detected fault, such as the amount of pump output power loss or fiber span loss, and may require multiple data-set collections to average out noise in the system.