1. Field of the Invention
The present invention relates to an optical transmission system and, more particularly, to a system that optimizes the optical transmission wavelength by incorporating the effects of negative fiber chromatic dispersion into the wavelength selection criteria.
2. Description of the Prior Art
With the recent advances in high-speed electronics, 10 Gb/s transmission is becoming an attractive technology for increasing optical transmission system capacity. When using a 1.55 µm chirp-free transmitter, the transmission distance at 10 Gb/s is limited to about 60 km of standard single-mode fiber (SMF), as a result of fiber chromatic dispersion. When using a transmitter wavelength of 1.3 µm, fiber chromatic dispersion has been considered as non-critical and system loss is instead attributed to factors such as optical fiber loss and receiver/regenerator sensitivity (although these limitations may be overcome by using amplifiers in the system).
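As a back-of-the-envelope illustration of the roughly 60 km limit cited above, the dispersion-limited reach can be estimated by dividing the accumulated-dispersion tolerance of the transmission format by the fiber's dispersion coefficient. The sketch below is not taken from the source: it assumes a commonly quoted tolerance of about 1000 ps/nm for 10 Gb/s chirp-free NRZ transmission, a typical standard-SMF dispersion of about 17 ps/(nm·km) near 1.55 µm, and the usual 1/B² scaling of the tolerance with bit rate.

```python
# Hedged sketch: dispersion-limited reach estimate.
# Assumed (not from the source): ~1000 ps/nm accumulated-dispersion
# tolerance for 10 Gb/s NRZ, ~17 ps/(nm*km) SMF dispersion at 1.55 um,
# and tolerance scaling as 1/B^2 with bit rate B.

def dispersion_limited_reach_km(bit_rate_gbps,
                                fiber_dispersion_ps_nm_km,
                                tolerance_ps_nm_at_10g=1000.0):
    """Estimate maximum fiber span (km) before chromatic dispersion
    closes the eye, by scaling the assumed 10 Gb/s tolerance by 1/B^2
    and dividing by the fiber dispersion coefficient."""
    tolerance = tolerance_ps_nm_at_10g * (10.0 / bit_rate_gbps) ** 2
    return tolerance / abs(fiber_dispersion_ps_nm_km)

reach = dispersion_limited_reach_km(10.0, 17.0)
print(f"Estimated 10 Gb/s reach over standard SMF: {reach:.0f} km")
```

Under these assumed numbers the estimate comes out on the order of the ~60 km figure quoted for 10 Gb/s over standard SMF; the exact value depends on the penalty criterion used to define the tolerance.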
However, as data rates continue to increase, it has become apparent that fiber chromatic dispersion at 1.3 µm will become problematic, requiring ever-increasing amounts of transmitter power, even for short-haul (e.g., 50 km or less) applications.