Until the 1970s, computer manufacturers often designed equipment that operated only in conformance with their own proprietary specifications. This approach was adequate when most computer installations consisted of large central processors performing local batch processing and/or time sharing.
By the 1980s, however, distributed processing by minicomputers and especially personal computers had become quite popular. Users of such systems quickly realized that it was no longer easy for them to share data with this arrangement, especially between different types of computers. With the need for a common high-speed data communication protocol evident, the Institute of Electrical and Electronics Engineers (IEEE) began Project 802 to develop an Open Systems Interconnection (OSI) specification that would be available for use by all computer manufacturers. The OSI would enable efficient intercommunication between computers connected in a local area network (LAN), without the need for end users to worry about implementation details that might vary from manufacturer to manufacturer.
Two conclusions were quickly reached in the course of the IEEE's project. First, getting different computers to communicate is a complex problem because of the diversity of their designs. Solving that problem requires architectural decisions not only at low levels, such as the physical wiring and modulation scheme, but also at higher levels, such as the computer's operating system. The IEEE thus developed an OSI LAN reference model having three "layers." The first, the physical layer, is concerned with the nature of the transmission medium. The second, called the media access control (MAC) layer, specifies the details of signaling over the physical layer. The final layer, called the logical link control (LLC) layer, is concerned with establishing, maintaining, and terminating links between devices.
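The layering just described can be sketched in software form as follows. This is a minimal illustration only; the class names, the preamble byte, and the framing format are invented for the sketch and are not taken from the IEEE 802 specifications.

```python
class PhysicalLayer:
    """Models the transmission medium: carries raw bytes."""

    def transmit(self, bits: bytes) -> bytes:
        # In real hardware this would modulate the medium.
        return bits


class MacLayer:
    """Frames LLC data and handles signaling over the physical layer."""

    def __init__(self, phy: PhysicalLayer):
        self.phy = phy

    def send(self, payload: bytes) -> bytes:
        # Prepend an illustrative preamble byte before handing the
        # frame down to the physical layer.
        frame = b"\x55" + payload
        return self.phy.transmit(frame)


class LlcLayer:
    """Establishes and maintains a link, delegating framing to the MAC."""

    def __init__(self, mac: MacLayer):
        self.mac = mac
        self.link_up = False

    def connect(self) -> None:
        self.link_up = True

    def send(self, data: bytes) -> bytes:
        # The LLC layer refuses to pass traffic until a link exists.
        if not self.link_up:
            raise RuntimeError("link not established")
        return self.mac.send(data)
```

The point of the sketch is the division of responsibility: each layer deals only with the layer directly beneath it, which is what allows implementation details at one level to vary from manufacturer to manufacturer without disturbing the levels above.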
The other conclusion, reluctantly reached by the IEEE, was that no single physical-layer architecture would be ideal for all situations. Performance can be sacrificed for lower cost in some applications, such as the typical office environment; the IEEE represents its 802.3 standard, also known as Ethernet, as ideal for that application. In critical environments such as factories, users will spend more money for a more robust network, and the IEEE 802.4 token-passing standard was developed for those applications.
Several manufacturers presently provide modems compatible with the IEEE 802.4 standard. However, these first-generation modem designs do not perform ideally in all situations, especially given the stringent signaling requirements of the physical layer, which dictate high-speed transmission and low bit error rates.
To provide functions such as loopback that are typically required in a data modem, the modem transmitter and receiver need to be phase-locked. The problem of phase synchronism is exacerbated by the fact that the MAC-layer cable between the modem and the upper-layer controller can be of varying lengths, introducing unpredictable delays over that interface.
The transmitter's encoding function also presents difficult design requirements for avoiding ground-loop problems, since the digital portion of the modem transmitter is often physically removed from the analog portion. The more signals that are passed between these portions, the more difficult it is to obtain the desired noise immunity. Additionally, it is desirable to minimize the number of input/output signal pins on the application-specific integrated circuits (ASICs) used to implement the digital portion of the modem.
Several signaling problems arise at the modem receiver as well. One is ensuring that the receiver is phase- and frequency-synchronized to a receive clock, with its automatic gain control circuit operating correctly, before the higher-order MAC and LLC layers are allowed to begin communication.
A related problem is that certain existing automatic gain control (AGC) circuits are particularly susceptible to noise corruption while the receiver is being synchronized.
The receiver should also monitor itself so that it will automatically attempt to relock without requiring periodic status interrupts from the controller.
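The gating and self-monitoring behaviour described in the preceding paragraphs can be sketched as a small state machine. The state names and method names here are hypothetical, chosen only to illustrate the requirement that upper layers stay disabled until lock is achieved and that relocking happens without controller intervention.

```python
from enum import Enum, auto


class RxState(Enum):
    ACQUIRING = auto()  # adjusting AGC, seeking phase/frequency lock
    LOCKED = auto()     # upper MAC/LLC layers may begin communication


class ReceiverSupervisor:
    """Gates upper-layer traffic on receiver lock and relocks on its own."""

    def __init__(self):
        self.state = RxState.ACQUIRING

    def lock_achieved(self) -> None:
        self.state = RxState.LOCKED

    def lock_lost(self) -> None:
        # Self-monitoring: re-enter acquisition immediately rather than
        # waiting for a periodic status interrupt from the controller.
        self.state = RxState.ACQUIRING

    def upper_layers_enabled(self) -> bool:
        return self.state is RxState.LOCKED
```

In this sketch the transition back to acquisition on lock loss is internal to the receiver, which is the self-monitoring property the text calls for.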
Another problem occurs in the operation of the AGC itself. Some prior designs, although able to set the proper AGC level quickly and accurately in most instances, could not reach the proper level under noisy conditions. Such designs typically first set the AGC to a maximum gain level. In a first, coarse-adjustment mode, the AGC's gain is continuously reduced until a threshold point is reached. A fine-adjustment mode is then enabled, in which received symbols whose nominal level corresponds to the maximum possible correct receiver level are tested against a threshold; if these are detected to be low, the AGC gain is increased. A problem with this approach, however, is that it does not allow pulling the AGC level back up if the coarse adjustment has reduced it too much, as may be caused by a noise pulse, because the highest symbol level will then never be detected.
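The prior-art two-mode AGC behaviour, including its failure under a noise pulse, can be modeled numerically. The gain values, step sizes, and thresholds below are made-up numbers chosen only to exhibit the behaviour; they come from no actual modem design.

```python
MAX_GAIN = 10.0
COARSE_THRESHOLD = 2.0    # coarse mode stops once scaled peak falls to this
FINE_LOW_THRESHOLD = 0.9  # fine mode boosts gain if a max-level symbol is low
MAX_SYMBOL = 1.0          # nominal level of the largest transmitted symbol
STEP = 0.5                # illustrative gain decrement


def coarse_adjust(gain: float, peak_input: float) -> float:
    # Reduce gain step by step until the scaled input peak reaches the
    # threshold point.  A noise pulse inflates peak_input and drives the
    # gain far lower than the true signal warrants.
    while gain > 0 and gain * peak_input > COARSE_THRESHOLD:
        gain -= STEP
    return gain


def classify(gain: float, symbol_level: float) -> float:
    # Decision slicer: with too little gain, even the largest symbol no
    # longer crosses the top decision boundary and is misclassified as a
    # smaller symbol.
    if gain * symbol_level >= FINE_LOW_THRESHOLD:
        return MAX_SYMBOL
    return 0.5  # some lower nominal level


def fine_adjust(gain: float, symbol_level: float, symbol_nominal: float) -> float:
    # Only symbols classified at the maximum nominal level are tested; if
    # such a symbol is received low, the gain is nudged back up.
    if symbol_nominal == MAX_SYMBOL and gain * symbol_level < FINE_LOW_THRESHOLD:
        gain += 0.05
    return gain
```

With a clean peak of 1.0, coarse adjustment settles at a usable gain and the fine mode can then regulate it. But if a noise pulse of amplitude 4.0 drives the coarse loop, the gain ends up so low that `classify` never again reports a max-level symbol, so `fine_adjust` never fires and the gain stays stuck: exactly the deadlock the text describes.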