Conventional telecommunication networks frequently comprise a lightwave transmission system designed to carry modulated light energy containing audio and/or video signals, or coded digital information. For example, current cable television (CTV) networks usually contain a lightwave transmission system designed to transmit conventional CTV signals over optical fiber lines. In many of these cable networks, the CTV signals are directly modulated onto a laser beam while an optical fiber network carries the modulated laser beam to a distribution point where the CTV signals are then transferred to a coaxial transmission line for delivery to subscribers.
Many conventional CTV systems can transmit signals having more than one hundred separate video channels, often spanning a band from about 50 megahertz (MHz) to about 750 MHz. Such CTV systems generally include a laser circuit in which CTV signals modulate a laser via a radio-frequency (rf) drive circuit. Since a typical laser's transfer function is not completely linear, a modulated laser beam will usually contain an extremely large number of distortion products. These undesired signal distortions can substantially degrade the ultimate television pictures.
It is generally well known that when an active laser in a lightwave transmission system is properly biased and not driven too hard, the nonlinear portion of that laser's transfer function, i.e., the distortion terms, will be primarily contained in second- and third-order terms. Skilled artisans usually refer to the second-order terms as the "composite second-order" (CSO) terms, and the third-order terms as the "composite third-order" or "composite triple-beat" (CTB) terms. The CSO and CTB terms are essentially noise-like signals that can be measured in a prescribed channel, and can be referenced to the carrier in that channel. One apparent way of reducing the magnitude of these CSO and CTB terms is to reduce the strength of the rf drive signal applied to the laser. However, this technique is not generally suitable because reducing the level of an rf drive signal will normally reduce the carrier-to-noise (C/N) ratio which can also degrade the quality of a television picture.
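The origin of the CSO and CTB terms can be illustrated numerically. The short Python sketch below drives an assumed memoryless third-order laser model with two tones; the model coefficients, tone frequencies, and sample rate are purely illustrative stand-ins, not measured laser data. Expanding the second-order term of the transfer function produces products at f1±f2 (CSO-type), while the third-order term produces products at 2·f1±f2 and 2·f2±f1 (CTB-type):

```python
import numpy as np

fs = 10_000                       # sample rate in Hz (illustrative)
t = np.arange(0, 1, 1 / fs)       # exactly 1 second of samples
f1, f2 = 55, 61                   # two carrier tones (illustrative stand-ins for channels)

# Two-tone rf drive signal
x = np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)

# Assumed memoryless laser model: linear term plus small 2nd- and 3rd-order terms
a2, a3 = 0.02, 0.01               # illustrative distortion coefficients
y = x + a2 * x**2 + a3 * x**3

spectrum = np.abs(np.fft.rfft(y)) / len(t)

def level(f):
    # A 1-second record gives 1 Hz FFT bins, so the bin index equals the frequency.
    return spectrum[int(f)]

# Second-order (CSO-type) products at f2-f1 and f1+f2;
# third-order (CTB-type) products at 2*f1-f2 and 2*f2-f1.
for f in (f2 - f1, f1 + f2, 2 * f1 - f2, 2 * f2 - f1):
    print(f"{f:4d} Hz  level = {level(f):.5f}")
```

Working the trigonometric expansion by hand, the x² term contributes an amplitude of a2 at f2−f1 (a spectral level of a2/2), and the x³ term contributes 3·a3/4 at 2·f1−f2 (a level of 3·a3/8), which the printed values confirm.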
One suitable prior art solution for reducing the effects of CSO and CTB distortion products in a lightwave transmission system involves the use of a predistortion technique in which a predistorter is placed in the transmitter before the laser. The predistorter functions to add a controlled amount of distortion to an rf drive signal that modulates the laser. If the magnitude of the added distortion introduced by the predistorter is the same as that produced by the laser, and the phase of that added distortion is 180 degrees different from that of the distortion produced by the laser, the added distortion will effectively cancel the distortion produced by the laser.
Thus it has been the general practice in lightwave transmission systems to employ predistortion techniques either to reduce out-of-limit distortion to acceptable levels, or to increase the magnitude of rf drive signals to improve the C/N ratio while still keeping unwanted distortion within specified limits. However, one of the most critical problems confronting designers of lightwave transmission systems has been optimizing their performance without increasing power consumption and/or costs. The present invention addresses this problem.