The explosive growth of data communications and database management has generated a boom in interactive, digital informational and transactional services. With the aid of the public telephone network and their personal computers and modems, literally millions of subscribers can interactively access specialty databases to obtain news, information, and entertainment, and to shop for goods and services of all types.
The PRODIGY.RTM. Interactive Service, available from the Prodigy Services Company of New York, is a leading offering in this field. It furnishes a host of informational and transactional features that bring the benefits of the information revolution to contemporary life. From the convenience of their homes and offices, subscribers can obtain headline news concerning current events, business, sports, science, politics, finance and health simply by calling up and selecting corresponding headings displayed on their computer screens. In addition, more in-depth information on these and other subjects can be obtained, together with expert commentary and analysis, by accessing a variety of departments and categories that even include an electronic encyclopedia containing well over 35,000 articles.
Additionally, subscribers can conveniently shop for a wide variety of goods and services. Subscribers can look over and select such items as clothing, sporting goods, foods, consumer electronics, computers, cars, real estate and others. Further, they can obtain services such as electronic banking, bill paying, loan transactions, stock brokerage, airline ticketing, lodging reservation, car rental, insurance coverage, legal advice, and education courses from an electronic university. Still further, subscribers can use the Service recreationally to interactively play games, pursue pastimes such as gardening, cooking and travel, and electronically correspond with other subscribers over the Service bulletin boards.
Such a sweeping range of offerings and ease of use have produced broad acceptance of interactive services and, in the case of PRODIGY, generated memberships well in excess of a million and a half in the four years since its public offering in late 1988. Success in providing service utility, however, has stimulated subscriber appetite for even more comprehensive performance. Once having seen how helpful and entertaining such a service can be, subscribers now seek service features containing even more information, enhanced with yet greater amounts of graphics. And, subscribers want these information-enriched features supplied to them with still less wait time between submission of the request at their keyboards and display of the feature at their computer screens.
Regrettably, however, there are constraints on the degree to which service providers can enhance their offerings. Inherent limitations in the public telephone network restrict the amount of information per screen that can be supplied practically to subscribers. More specifically, bandwidth restrictions inherent in the telephone network limit the rate at which digital data can be transmitted using the public phone lines. Accordingly, service providers are limited in the amount of information they can present per screen and still maintain acceptable wait times between screens.
In 1875, when Alexander Graham Bell first conceived his telephone system, the objective, while wondrous for the times, was merely to enable people to talk with one another. To that end, he proposed to convert the sound waves produced by a caller's speech into analog electrical signals having amplitudes and frequencies corresponding to those sound waves, and to thereafter transmit the analog electrical signals over ordinary circuits to a listener, where the signals would be converted back into sound waves that matched the caller's speech. In implementing this system, the circuits of wires, cables and amplifiers developed to connect callers and listeners, i.e., the public telephone network, were provided with only a limited capacity for carrying electrical signals so as to keep costs low enough to allow for mass usage, the anticipated key to success. Specifically, the designers provided the telephone circuits with only enough electrical bandwidth to transmit the portion of the vocal range sufficient to enable parties to a conversation to understand one another. As a result, public phone lines evolved over time to have an electrical bandwidth of approximately 3,000 Hz, lying between 300 Hz and 3,400 Hz. In the early days, no one foresaw the coming digital communication revolution and its needs.
Digital data signals; i.e., the type of signals required for interactive services, on the other hand, are not suited to transmission over limited bandwidth lines. Because digital signals are composed of pulse trains that inherently contain a much broader range of frequencies, they are subject to substantial distortion when sought to be transmitted over phone lines. The limited bandwidth of the telephone lines cuts off the higher frequency and d.c. components of digital pulse trains, causing the pulses that make up the trains to smear out and run together, with resulting loss of the digital information.
While innovation has enabled ordinary phone lines to carry digital data, significant limitations in data throughput remain. In the 1960's, modems (modulator/demodulators) were introduced that permitted digital data to be sent over public telephone lines. In accordance with the design, a modem is connected to each of the sending and receiving ends of the line. At the sending end, the modem is arranged to accept trains of pulses from the digital equipment; e.g., a computer, the presence and absence of pulses in the trains representing the 1 and 0 bits of the digital data. The modem then successively converts the 1 and 0 bits of the pulse trains into two distinct and corresponding analog electrical signals, one representing the 1 bits and the other representing the 0 bits. This is done by modulating one or more of the characteristics; i.e., amplitude, frequency, or phase, of a first carrier; e.g., a 1,200 Hz sine wave. At the receiving end, the other modem detects the two distinct analog signals resulting from modulation of the first carrier, reproduces digital signals; i.e., trains of pulses identical to the original, and supplies the pulse trains to the digital equipment at the receiving end of the line; e.g., another computer.
In addition, to enable simultaneous, bi-directional data transmission, a second carrier; e.g., 2,400 Hz, is provided so that two further signals, spaced in frequency from the first two, can be generated that permit the receiving modem to concurrently transmit 1 and 0 bits of digital information to the sending modem, so-called "full-duplex" operation. By manipulating the amplitude, frequency and phase of the first and second carrier signals, a number of distinct states for each can be defined that enable digital data to be bidirectionally transmitted at rates of up to 14,400 bits per second over the phone lines.
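The carrier-modulation principle described above can be sketched in simulation. The fragment below models a simple binary frequency-shift-keyed link: 1 and 0 bits are sent as short bursts of two distinct tones, and the receiver recovers the bits by counting zero crossings per bit period. The sample rate, baud rate and tone frequencies are hypothetical values chosen for clarity of illustration, not the parameters of any actual modem standard.

```python
import math

SAMPLE_RATE = 9600                 # samples per second (hypothetical)
BAUD = 300                         # bits per second (hypothetical)
F_MARK, F_SPACE = 1200.0, 2200.0   # tone for a 1 bit, tone for a 0 bit

def modulate(bits):
    """Convert a bit string into a list of waveform samples."""
    samples_per_bit = SAMPLE_RATE // BAUD
    out = []
    for i, b in enumerate(bits):
        f = F_MARK if b == "1" else F_SPACE
        for n in range(samples_per_bit):
            t = (i * samples_per_bit + n) / SAMPLE_RATE
            out.append(math.sin(2 * math.pi * f * t))
    return out

def demodulate(samples):
    """Recover bits by counting zero crossings in each bit period."""
    samples_per_bit = SAMPLE_RATE // BAUD
    # Midpoint between the expected crossing counts of the two tones.
    threshold = samples_per_bit * (F_MARK + F_SPACE) / SAMPLE_RATE
    bits = []
    for i in range(0, len(samples), samples_per_bit):
        chunk = samples[i:i + samples_per_bit]
        crossings = sum(
            1 for a, b in zip(chunk, chunk[1:]) if (a < 0) != (b < 0)
        )
        # More crossings -> higher frequency -> the 0-bit (space) tone.
        bits.append("0" if crossings > threshold else "1")
    return "".join(bits)
```

A real modem operating at 14,400 bits per second combines amplitude and phase modulation to pack several bits into each symbol; the two-tone scheme above only illustrates the basic idea of mapping bits onto distinct analog states.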
But, even at the data transmission rates of 14,400 bits per second, traditional telephone lines are limited in their ability to supply the screen data necessary to sustain an information-enriched interactive service. For example, it takes approximately 2 seconds for a contemporary microcomputer (386/25 IBM-compatible machine) at 14,400 bps to receive and display the screens typical of an interactive service such as PRODIGY. However, these screens are predominantly character based, and contain on the order of only 16K data bits per screen. Where such screens are supplemented with graphics, say for example where half the screen includes 16 color, VGA images, the screen bit count rises dramatically and can include on the order of 600K bits per screen. And, with that increase in data, the time between received screens can rise to over a minute and a half. Further, where the desire is to present full screen VGA color graphics, the time between screens can balloon even more dramatically to almost three and a half minutes, a commercially unacceptable delay. Still further, the increase in required data throughput and associated delay becomes even more pronounced where enhancements of sound and motion are contemplated.
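The delays quoted above follow directly from dividing screen size by line rate. The arithmetic below computes raw line time only; the somewhat longer figures in the text also reflect protocol overhead and the subscriber machine's processing and display time, so these numbers are a lower bound rather than an exact reconstruction.

```python
# Raw transmission time at 14,400 bps for the screen sizes discussed
# above (line time only; protocol and display overhead excluded).
BPS = 14_400

def line_time_seconds(bits):
    """Seconds of line time to move `bits` at the stated rate."""
    return bits / BPS

screens = [
    ("character-based screen (~16K bits)", 16 * 1024),
    ("half-screen 16-color VGA (~600K bits)", 600 * 1024),
    ("full-screen 16-color VGA (640 x 480 x 4 bits)", 640 * 480 * 4),
]
for label, bits in screens:
    print(f"{label}: {line_time_seconds(bits):.1f} s of line time")
```

Even the uncompressed line time for a half-graphics screen runs to tens of seconds, and a full-screen image exceeds a minute, which is why the text characterizes such delays as commercially unacceptable.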
Certain strategies have been developed to reduce the effects of line delay. For example, caching data at the subscriber's computer, a technique pioneered by PRODIGY, avoids the need to transmit data from the service host each time a screen is requested. However, due to limitations on available local storage and the subscriber's continuing call for new applications and associated screens, the service still must supply a substantial amount of data over the phone lines to support an interactive session. As a result, the attendant delay associated with conventional phone lines remains an obstacle to providing information-enriched service screens.
In the past, engineers have sought to overcome problems of transmission delay and limited storage associated with the handling of large amounts of digital data. However, the developments resulting from those efforts have not been well suited in their original forms to interactive services. Previously, workers in the digital communication field found that the effects of transmission delay and limited storage could be significantly reduced by compressing the data prior to transmission and storage, and subsequently expanding; i.e., decompressing, the data at the time of use. At an early point, it was found that information streams such as text and graphics tended to be both predictable and repetitive. Further, it was recognized that where the appearance of data strings within the information streams could be statistically predicted, representative code words, having lengths dependent on the probability of appearance of the strings but shorter on average than the original strings, could be created and substituted so as to decrease the amount of data actually transmitted and stored.
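The statistical substitution idea above is the basis of Huffman-style coding: frequent symbols receive short code words, rare symbols longer ones, so the stream shrinks on average. The sketch below is a generic textbook illustration, not the coding scheme of PRODIGY or any particular service.

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a prefix-free bit-string code word for each symbol,
    with lengths driven by the symbols' observed frequencies."""
    freq = Counter(text)
    # Heap entries: (weight, tiebreak, {symbol: code-so-far}).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)   # two lightest subtrees
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

codes = huffman_codes("abracadabra")
encoded = "".join(codes[s] for s in "abracadabra")
# 'a' (5 of 11 symbols) gets the shortest code word, so the encoded
# stream needs far fewer bits than 8 bits per character would.
```

Note that this is fixed (non-adaptive) statistical compression as described in the text: the frequency table is gathered once, and both ends must share the resulting code table before decoding is possible.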
Additionally, it was recognized that where strings of data were repetitive, reduced-size codes could be substituted using specially prepared dictionaries. Typically, the dictionaries included code words of a fixed length that depended on the number of entries in the dictionary but was less than the length of the selected repetitive strings. Further, in some implementations, the code words were indexed to prior appearing strings so they could be substituted as the strings repeated in the data stream. In this way, the data actually transmitted and stored, again, could be reduced.
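The dictionary substitution described above can be illustrated with an LZW-style coder, in which the dictionary is built from previously seen strings and repeated strings are replaced by integer codes indexing it. This is a generic illustration of the technique, not the dictionary scheme of any particular service.

```python
def lzw_compress(data):
    """Return a list of integer codes for the input string."""
    dictionary = {chr(i): i for i in range(256)}  # seed: single bytes
    next_code = 256
    phrase, codes = "", []
    for ch in data:
        if phrase + ch in dictionary:
            phrase += ch                 # keep extending the match
        else:
            codes.append(dictionary[phrase])
            dictionary[phrase + ch] = next_code  # learn the new string
            next_code += 1
            phrase = ch
    if phrase:
        codes.append(dictionary[phrase])
    return codes

def lzw_decompress(codes):
    """Rebuild the original string; the decoder reconstructs the
    same dictionary, so no table need be transmitted."""
    dictionary = {i: chr(i) for i in range(256)}
    next_code = 256
    prev = dictionary[codes[0]]
    out = [prev]
    for code in codes[1:]:
        entry = dictionary.get(code, prev + prev[0])  # KwKwK case
        out.append(entry)
        dictionary[next_code] = prev + entry[0]
        next_code += 1
        prev = entry
    return "".join(out)
```

Because each emitted code stands for a progressively longer prior string, repetitive streams compress well, while the fixed code width ties the table size to the number of dictionary entries, the disk-storage cost noted later in the text.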
Still further, workers found that both approaches could be rendered even more effective, in certain cases, if the respective statistics or dictionaries were adjusted during the compression process to more accurately reflect the composition of the data stream being compressed. This adjustment process, or modeling of the data source on the fly, has become known as adaptive compression, and is typically found to provide better results than fixed compression; i.e., compression where the encoding statistics or dictionaries are set for the system at some initial time and thereafter not adjusted. As might be expected, however, this improvement in performance is not without a price. The complexity associated with adaptive compression invariably requires that additional computing resources and storage space be provided to handle the more elaborate algorithms and retain the results. Additionally, adaptive approaches, particularly as implemented in dictionary-type systems, are found to be less effective where the data sample size is small; e.g., less than 500 bytes.
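The on-the-fly modeling described above can be made concrete with a toy order-0 adaptive model: the symbol frequency table starts uniform and is updated after every symbol, so the estimated cost in bits of each symbol falls as the model learns the stream. The Laplace-smoothed starting counts and the two-symbol alphabet are assumptions for illustration; this models the principle of adaptivity, not any production coder.

```python
import math
from collections import Counter

def adaptive_cost_bits(stream, alphabet):
    """Total ideal code length (-log2 p per symbol) when the symbol
    probabilities are re-estimated after every symbol seen."""
    counts = Counter({s: 1 for s in alphabet})  # smoothed uniform start
    total_bits = 0.0
    for sym in stream:
        p = counts[sym] / sum(counts.values())
        total_bits += -math.log2(p)
        counts[sym] += 1            # adapt: the model follows the data
    return total_bits

skewed = "a" * 90 + "b" * 10
adaptive = adaptive_cost_bits(skewed, "ab")
fixed_uniform = len(skewed) * 1.0   # 1 bit/symbol if statistics never adapt
# On this skewed stream the adaptive model charges far fewer total
# bits than the never-adjusted uniform model.
```

The per-symbol table update is exactly the extra bookkeeping the text refers to: the adaptive coder does more work and holds more state than a fixed one, and on very short streams the model has little time to learn before the data runs out.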
Regrettably however, because of constraints in the environment in which interactive services such as PRODIGY must operate, traditional statistical and dictionary compression, whether adaptive or fixed, have not been well suited to solving the problems presented by information-enriched features.
In the case of statistical compression, while source analysis can lead to symbol predictability that provides significant degrees of compression; e.g., compression ratios on the order of 0.5 and better, this approach demands significant computing resources; e.g., CPU and RAM, to systematically inspect the data stream, build frequency statistics and generate the unique abbreviated code words needed for compression. Additionally, significant amounts of disk storage space are required to hold the required program code and analysis results. Still further, time is required to process the compression and decompression algorithms, which adversely affects screen presentation performance given the type of equipment commonly used by an interactive service subscriber.
Additionally, while dictionary techniques typically can also provide significant compression, and do so even faster and with less demand on CPU and RAM elements, they still require significant amounts of disk storage space. In the case of dictionary compression, table look-up of code words, rather than computation, is relied on, thus reducing CPU and RAM time and requirements; still, to provide meaningful coverage of the symbols in the data stream, significant table size is needed. Accordingly, disk storage space requirements can be significant. Further, dictionary techniques also require some computing resources to manage their algorithms.
Still further, where either statistical or dictionary compression is sought to be made adaptive, yet additional amounts of CPU, RAM and disk storage resources are required, which may even call for use of specialty processing and storage facilities in the form of dedicated hardware to implement the method steps. And, where dictionary look-up is done adaptively; i.e., where a block of previously transmitted data becomes the dictionary consulted to determine whether the current data is repeating, the process, as noted, becomes less effective as the block size becomes smaller; e.g., on the order of 500 bytes or less, as is typical of interactive services such as PRODIGY.
In view of their resource-intensive character, conventional data compression systems have not been found well suited to interactive services such as PRODIGY. In order to make its features available to as wide an audience as possible, PRODIGY was designed to work with the microcomputer systems commonly found in today's homes and offices. These systems include not only the latest in microcomputer technology, but also substantial numbers of lower-cost, older computer units; e.g., IBM and IBM-compatible units having Intel 8088 and 80286 microprocessors, or comparable Apple systems, that lack the comprehensive computing power and storage resources of state-of-the-art equipment. And, in these older units, the resource demands of conventional statistical or dictionary compression would either not be met or would severely burden system performance. Additionally, because the PRODIGY Service is intended to run on ubiquitous systems at minimal cost, it would be commercially impractical to use specialty add-in boards or other compression hardware that might otherwise prove of value.