1. Field of the Invention
This invention is directed to a performance evaluation analysis in optical systems, and more particularly to a distortion measurement procedure using noise loading.
2. Background of the Invention
Optical signals suffer degradation between the transmitter and receiver from such factors as noise, inter-symbol interference, fiber dispersion, non-linearity of the elements and transmission medium, etc. In addition, in amplified wavelength division multiplexed (WDM) systems, the transmission characteristics vary from one channel to another due to the non-flat gain and noise profile of erbium-doped fiber amplifiers (EDFAs).
Distortion is defined as any inaccurate replication of a signal transmitted over a communication link, and can be referred to any network element (NE) along the link. It is measured by assessing the difference between the wave shape of the original signal and that of the signal at the network element of interest, after the signal has traversed the transmission link.
In the last decade, transmission rates of data signals have increased progressively, which demands faster and more sensitive transmission systems. For transmission at high rates, such as 40 or 80 Gb/s, the distortion of the optical link is a critical parameter. With the various types of dispersion shifted fiber, dispersion compensating fiber and dispersion compensating filters that make up a given link, determining distortion is no longer a simple operation, especially in optical transmission systems with in-line optical amplifiers. System performance degradation caused by noise and that caused by optical path distortions are usually difficult to separate, making the performance evaluation complicated.
In the evaluation of the characteristics of an optical fiber communication system, the bit error rate (BER) has usually been used as a parameter for performance evaluation. BER is defined as the ratio of the number of erroneously received bits to the total number of bits received over a given period of time (typically one second). A number of codes are provided in the signal at the transmitter for error detection, the basic idea being to add redundant bits to the input data stream over a known number of bits. The BER calculated by the receiver includes information on all impairments suffered by the signal between the transmitter and receiver, i.e. both noise and distortion information.
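The BER definition above can be sketched as a simple comparison of a received bit stream against the known transmitted pattern; the function name and sample data are illustrative only, not part of the invention.

```python
def bit_error_rate(transmitted, received):
    """Return the ratio of erroneously received bits to total bits received."""
    assert len(transmitted) == len(received)
    errors = sum(t != r for t, r in zip(transmitted, received))
    return errors / len(transmitted)

tx = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
rx = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]   # one flipped bit out of ten
print(bit_error_rate(tx, rx))          # → 0.1
```

In practice the receiver does not know the transmitted stream bit-for-bit; the redundant error-detection bits mentioned above play the role of the reference pattern.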
Performance of an optical system is also defined by a parameter called Q. The Q value indicates the signal-to-noise ratio of the electric signal regenerated by the optical receiver, and is defined as follows:

Q = (μ_m − μ_s) / (σ_m + σ_s)

where μ_m is the mean value of the '1's, μ_s is the mean value of the '0's, σ_m is the standard deviation of the level of the '1's, and σ_s is the standard deviation of the level of the '0's. In the absence of distortion, Q entirely represents the bit error rate (BER) performance of the system, and this property is used in the present invention.
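A minimal sketch of the Q definition above, together with the standard Gaussian-noise relation BER = ½·erfc(Q/√2) that underlies the statement that, absent distortion, Q entirely represents the BER. The sample levels are illustrative; the erfc relation assumes Gaussian noise on both rails and is not claimed by the patents discussed here.

```python
import math
import statistics

def q_factor(ones, zeros):
    """Q = (mu_m - mu_s) / (sigma_m + sigma_s), with mu_m, sigma_m the mean
    and standard deviation of the received '1' levels and mu_s, sigma_s
    those of the '0' levels."""
    mu_m, mu_s = statistics.mean(ones), statistics.mean(zeros)
    sigma_m, sigma_s = statistics.pstdev(ones), statistics.pstdev(zeros)
    return (mu_m - mu_s) / (sigma_m + sigma_s)

def ber_from_q(q):
    """BER implied by Q under the Gaussian-noise, no-distortion assumption."""
    return 0.5 * math.erfc(q / math.sqrt(2))

q = q_factor(ones=[0.9, 1.0, 1.1], zeros=[-0.1, 0.0, 0.1])
print(round(q, 2))   # → 6.12
```

The one-to-one mapping from Q to BER is exactly what breaks down once distortion is present, which motivates the separation of the two contributions.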
Optical systems have very low BERs under nominal conditions of operation, and therefore measurement of BER is time consuming. In a system having a transmission rate of 5 Gb/s, for instance, a minimum of six hours is needed to measure a BER of 10⁻¹⁴ or below. It is also evident that the BER may vary significantly during this long period of time. Thus, if the BER increases to 10⁻¹⁰ for even a short period, the mean value of the BER over the above six hours will never reach 10⁻¹⁴, making the measurement unreliable.
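The six-hour figure above follows from simple arithmetic: at bit rate R and true error ratio BER, errors arrive at an average rate R·BER, so observing even one expected error takes 1/(R·BER) seconds. A brief illustrative sketch:

```python
def seconds_per_error(bit_rate_bps, ber):
    """Average time between errors at the given bit rate and error ratio."""
    return 1.0 / (bit_rate_bps * ber)

# At 5 Gb/s and BER = 1e-14, one error is expected roughly every 20,000 s:
hours = seconds_per_error(5e9, 1e-14) / 3600
print(round(hours, 1))  # → 5.6 (about six hours for a single expected error)
```

Measuring with any statistical confidence requires several errors, so the practical measurement time is a multiple of this per-error interval.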
U.S. Pat. No. 5,585,954 (Taga et al., issued Dec. 17, 1996 and assigned to Kokusai Denshin Denwa Kabushiki Kaisha) discloses a method for measuring the Q factor as a performance evaluation parameter for a transmission system. The performance evaluation according to this patent is based on the assumption that there is a one-to-one correspondence between BER and Q at the decision threshold (reference voltage) for an optimum BER. However, the patent does not account for distortion, and is also concerned with reducing the time necessary for measuring Q and obtaining real-time Q values, rather than with separating the noise and distortion contributions to the errors along a transmission path.
Signal-to-noise ratio (SNR) is a parameter that represents noise only characteristics of a system. In non-optical systems, where envelope-detection (square-law detection) is not necessary, the noise is generally independent of the signal level, and as such, in the absence of distortion, SNR is the only determining parameter for BER performance of the system. In these systems, an AGC (automatic gain controller) may be used to compensate for variations in the received power.
On the other hand, in optical systems, because of the square-law detection effected at the receiver, there are some signal-dependent noise components, so that the optical SNR (OSNR) depends on the signal level. As such, the BER of optical systems depends not only on the OSNR, but also on the signal, i.e. on the level of the received power. Even when an equivalent optical AGC is used, the BER performance of the system is not completely independent of the received optical power. The present invention accounts for this dependency by effecting all measurements of BER and OSNR for the same power of the signal.
There are several test instruments available for measuring the extent of signal degradation using an eye closure diagram. An eye closure diagram is the graphic pattern produced on an oscilloscope when a baseband signal is applied to the vertical input of the oscilloscope and the symbol rate triggers the instrument time base. For a binary signal, such an eye diagram has a single eye which is open or closed to an extent determined by the signal degradation. An open pattern is desired. Changes in the eye size indicate inter-symbol interference, amplitude irregularities, or timing problems, such as jitter, depending on the signal that is measured.
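The triggering described above amounts to folding the waveform at the symbol period and overlaying the resulting one-symbol traces; a hypothetical software sketch of that folding (the oscilloscope time base does this in hardware):

```python
def eye_traces(samples, samples_per_symbol):
    """Fold a sampled baseband waveform into overlaid one-symbol traces,
    as an oscilloscope triggered at the symbol rate would display them."""
    n = len(samples) // samples_per_symbol
    return [samples[i * samples_per_symbol:(i + 1) * samples_per_symbol]
            for i in range(n)]

wave = [0, 1, 1, 0, 0, 0, 1, 1]        # 4 symbols, 2 samples per symbol
print(eye_traces(wave, 2))              # → [[0, 1], [1, 0], [0, 0], [1, 1]]
```

Plotting all such traces on common axes produces the eye pattern; noise spreads the traces vertically, while inter-symbol interference and jitter close the eye horizontally.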
For example, U.S. Pat. No. 4,823,360 (Tremblay et al., issued Apr. 18, 1989 and assigned to Northern Telecom Limited) discloses a device for measuring chromatic dispersion of an optical fiber based on a baseband phase comparison method. The device described in this U.S. patent evaluates the transmission link performance using three threshold levels for recovering data. Two of the thresholds are obtained by measuring the level of "long 0s" and "long 1s", respectively, for a preset error rate, and the third threshold is provided in a selected relationship to the other two.
U.S. Pat. No. 4,799,790 (Tsukamoto et al., issued Jan. 24, 1989 and assigned to Anritsu Corporation) discloses a device comprising a transmitter for launching signals of various wavelengths into a reference or test fiber, and a receiver. At the receiver, the phase difference between two adjacent wavelengths is measured for both the reference and test paths for determining the delay of the respective wavelength.
None of these U.S. patents is concerned with providing a distortion measurement procedure that is simple to implement and gives a reliable measure of the contribution of the distortion to the performance.