1. Technical Field
The present invention relates, in general, to a method and system to be utilized with wireless communications systems having cellular architectures. In particular, the present invention relates to a method and system to be utilized with wireless communications systems having cellular architectures which utilize Code Division Multiple Access (CDMA). Yet still more particularly, the present invention relates to a method and system, to be utilized with wireless communications systems having cellular architectures which utilize CDMA, which increase the reliability of such wireless communications systems by avoiding communication failure.
2. Description of the Related Art
The present invention is related to wireless communication systems, and, in particular, to wireless communications systems which have a cellular architecture (e.g., cellular telephony, personal communications systems) and which utilize CDMA (or similar technologies). Wireless communication refers to the fact that transmission between sending and receiving stations occurs via electromagnetic radiation not guided by any hard physical path (e.g., via microwave link). Cellular architecture refers to the fact that the wireless system effects service over an area by utilizing a system that can (ideally) be pictographically represented as a cellular grid. CDMA stands for Code Division Multiple Access, which is a type of spread spectrum technology, originally developed for military application and thereafter adapted for civilian use.
Wireless cellular communication utilizing CDMA is the latest incarnation of a technology that was originally known as mobile telephone systems. Early mobile telephone system architecture was structured similarly to television broadcasting: one very powerful transmitter located at the highest spot in an area would broadcast over a very large radius. If a user were within the usable radius, then that user could communicate by radio telephone with the base station. However, such systems proved to be very expensive for the users and not very profitable for the communication companies supplying such services. The primary limiting factor of the original mobile telephone systems was that the number of channels available for use was limited due to severe channel-to-channel interference within the area served by the powerful transmitter.
Counter-intuitively, engineers discovered that channel-to-channel interference effects within the service area were not related solely to the distance between stations communicating with a base station transmitter (which intuitively would seem to give rise to the interference), but also increased with the usable radius of the roughly circular area being served by that transmitter. Engineers found that by reducing the radius of the area served by a base station transmitter by a significant percentage, channel-to-channel interference effects were reduced such that a significant number of additional usable channels could be provided. For example, it was found that a system based on cells with a one-kilometer usable radius would have 100 times more usable channels than a system based on cells with a 10-kilometer usable radius.
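The 100-fold figure above follows from an inverse-square relationship between cell radius and channel count. The short calculation below is an illustrative sketch only; the function name is hypothetical, and the assumption that usable channels scale with the inverse square of the cell radius is taken from the 1-kilometer versus 10-kilometer example given above.

```python
def relative_channel_capacity(radius_km: float, reference_radius_km: float) -> float:
    """Usable channels scale roughly with the inverse square of cell radius:
    halving the radius quarters each cell's area, so four times as many
    cells (and hence channel groups) fit in the same coverage region."""
    return (reference_radius_km / radius_km) ** 2

# Cells with a 1 km radius support ~100x the channels of 10 km cells.
print(relative_channel_capacity(1.0, 10.0))  # 100.0
```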
Reducing the power of the central transmitter allowed a significant increase in the number of available channels by reducing channel-to-channel interference within an area. However, as the power of the central transmitter was reduced, the serviceable area was also reduced. Consequently, although reducing transmission power increased the number of available channels, the small service area provided by such reduced power did not make such radio telephone systems attractive communication options for many users. Thus, a problem arose relating to how to utilize the discovery that smaller area size (or, equivalently, reduced transmitter power) increased the available channels such that radio telephone systems based on such smaller areas would be commercially viable.
This problem was solved by the invention of the wireless cellular architecture concept. The wireless cellular architecture concept utilizes geographical subunits called "cells" and encompasses what are known as the "frequency reuse" and "handoff" concepts. A cell is the basic geographic unit of a cellular system. Cells are defined by base stations (a base station consists of hardware located at the defining location of a cell and includes power sources, interface equipment, radio frequency transmitters and receivers, and antenna systems) transmitting over small geographic areas that are represented (ideally) as hexagons. The term "cellular" comes from the honeycomb shape of the areas into which a coverage region, served via two or more base stations, is divided when the mathematically ideal hexagonal shape is used to represent the usable geographic area of each of the two or more base stations. It is to be understood that, although the mathematically ideal shape of the cell is a hexagon, in practice each cell's size and shape vary depending upon the landscape (e.g., a cell defined by a base station transmitting on a flat plane will closely approximate the ideal hexagon, whereas one defined by a base station transmitting in a valley surrounded by hills will not, due to the interference from the surrounding hills).
Within each cell a base station controller talks to many mobile subscriber units at once, utilizing one defined transmit/receive communications channel for each mobile subscriber unit with which communication is taking place. Each mobile subscriber unit (a control unit and a transceiver that transmits and receives wireless transmissions to and from a cell site) uses a separate, temporarily assigned transmit/receive wireless channel to talk to a cell site. Each wireless transmit/receive communications channel consists of a pair of frequencies for communication--one frequency for transmitting from the cell site base station controller to the mobile subscriber unit, named the forward link, and one frequency for transmitting from the mobile subscriber unit to the cell site base station controller, named the reverse link.
Wireless communication is regulated by government bodies (e.g., the Federal Communications Commission in the United States). Government bodies dictate what frequencies in the wireless spectrum can be used for particular applications. Consequently, there is a finite set of frequencies available for use with cellular communications. The frequency reuse concept is based on assigning to each cell a group of radio channels to be used within that small geographic area (cell). Adjacent cells are assigned groups of channels that are completely different from those assigned to their neighbors. Thus, in the frequency reuse concept there is always a buffer cell between two cells utilizing the same set of frequencies. The cells are sized such that it is not likely that two cells utilizing the same set of frequencies will interfere with each other. Thus, such a scheme allows "frequency reuse" by non-adjacent cells. As a mobile subscriber unit transits adjacent cells, it is "handed" from the cell it is leaving to the cell it is entering by directing the mobile subscriber unit to stop using frequencies appropriate to the cell it is leaving and to begin using frequencies appropriate to the cell it is entering. Thus, the cellular architecture concept, in conjunction with the frequency reuse concept and augmented by the operation of "hand-off," gave rise to the ability to utilize small cells to provide communications service over a large geographic area.
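The frequency reuse assignment described above can be sketched in miniature. The following is a toy one-dimensional model with three hypothetical channel groups assigned cyclically to a row of cells; the group names, frequency values, and three-group reuse pattern are all illustrative assumptions, not taken from any actual deployment.

```python
# Hypothetical channel groups (forward-link MHz values are illustrative).
CHANNEL_GROUPS = {
    "A": [869.0, 869.3, 869.6],
    "B": [870.0, 870.3, 870.6],
    "C": [871.0, 871.3, 871.6],
}

def assign_group(cell_index: int) -> str:
    """Assign one of three reusable channel groups to a cell in a row,
    cycling A, B, C so that cells sharing a group are never adjacent."""
    return "ABC"[cell_index % 3]

cells = [assign_group(i) for i in range(7)]
print(cells)  # ['A', 'B', 'C', 'A', 'B', 'C', 'A']

# Adjacent cells never share a group, so the same frequencies are
# "reused" only by cells separated by at least one buffer cell.
assert all(cells[i] != cells[i + 1] for i in range(len(cells) - 1))
```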
The first large-scale wireless communications system utilizing cellular architecture in North America was the Advanced Mobile Phone Service (AMPS), which was released in 1983. With the introduction of AMPS, user demand for bandwidth was initially low, but it increased rapidly as users became acquainted with the power of the system. Very quickly, even the extended number of channels made available by the cellular concepts of reduced power output and frequency reuse was consumed by user demand in certain geographic areas, and a problem arose with respect to capacity.
Engineers responded to the problem by devising the Narrowband Analog Mobile Phone Service (NAMPS). NAMPS utilizes frequency division multiplexing to transmit three transmit/receive channels in the same bandwidth in which AMPS had previously transmitted only one transmit/receive channel. Thus, NAMPS essentially tripled the capacity of AMPS. Even so, the extended number of channels available with NAMPS was quickly consumed by user demand in certain geographic areas, and a problem again arose with respect to capacity.
Engineers responded to this new problem by devising Digital AMPS (or DAMPS, also known as TDMA). In DAMPS/TDMA, time division multiple access techniques are utilized to multiplex user data together. Furthermore, digital data compression techniques are utilized at both the transmission and reception ends. These techniques give rise to increased capacity and clarity, even exceeding that of NAMPS. However, as was the case with both AMPS and NAMPS, the increased bandwidth capacity of DAMPS/TDMA has been quickly consumed by user demand in certain geographic areas.
Subsequent attempts to increase cellular telephony bandwidth capacity tended to be variations on the foregoing described themes. However, it became apparent that some new communications technology would be necessary to give rise to any significant increase in bandwidth beyond that available with the foregoing described technologies. It was decided within the industry that such new technology would be standard CDMA, which stands for Code Division Multiple Access.
Notice that in all the foregoing described technologies, the method of using multiple transmit/receive channels with each such transmit/receive channel utilizing a different pair of frequencies was maintained throughout. Standard CDMA breaks completely with this method of communication.
Standard CDMA utilizes cellular architecture and a type of hand-off. However, in standard CDMA, transmission and reception are done by all users on the same frequency. Standard CDMA is able to achieve this feat by ensuring that the signals from different users are adjusted such that the signals do not interfere with each other to the point that the messages from the different users cannot be understood.
The way in which standard CDMA works is somewhat analogous to a situation in which two English-speaking persons are communicating in a room wherein many other non-English speakers are also communicating in a language which the two English speakers do not understand. Since the two English speakers do not understand the language spoken by the non-English speakers in the room, the conversations of their non-English-speaking counterparts will be interpreted by the two English speakers as meaningless "noise." Consequently, since the English speakers will attach no meaning to the "noise," the English speakers will be able to disregard the "noise" and continue to engage in their conversation, provided that they both speak loudly enough so that each can be understood by the other despite the "noise" generated by their non-English-speaking counterparts. This is true even though all persons in the room are talking, or communicating, in the same band of sound frequencies which the human ear can hear.
Standard CDMA is able to achieve the same effect by modulating the signal of each user within a particular cell with a "pseudo-noise" code which, in effect, makes each user in the cell appear as if each user were "speaking a different language," thereby ensuring that the meaning of a signal generated by one user within the cell will not be drowned out by the signals generated by one or more other users in the cell. This holds provided, of course, that each user speaks "loudly" enough (or transmits enough power) to be understood over the "noise" generated by the other users in the CDMA cell.
Standard CDMA utilizes digital data technology to achieve the foregoing. Standard CDMA utilizes complex digital codes to modulate user data prior to transmission within a cell. The standard CDMA pseudo-noise codes are chosen such that a modulated signal, when transmitted upon a carrier frequency within the cell, approximates white (or Gaussian) noise, and does not greatly interfere with any other signal transmitted upon the same carrier frequency within the cell. Upon reception, a similar pseudo-noise code is used to demodulate the signal and recover the data that was transmitted.
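The modulation and demodulation described above can be illustrated with a toy spreading example. The 8-chip codes below are illustrative assumptions chosen to be mutually orthogonal; real pseudo-noise sequences are far longer and are chosen so the spread signal approximates Gaussian noise.

```python
# Two hypothetical 8-chip spreading codes; their dot product is zero,
# so each user's "language" is invisible to the other's despreader.
CODE_A = [1, -1, 1, -1, 1, -1, 1, -1]
CODE_B = [1, 1, -1, -1, 1, 1, -1, -1]

def spread(bit: int, code: list) -> list:
    """Modulate one data bit (+1 or -1) onto a spreading code."""
    return [bit * chip for chip in code]

def despread(signal: list, code: list) -> int:
    """Correlate the received signal with a code to recover one bit."""
    acc = sum(s * c for s, c in zip(signal, code))
    return 1 if acc > 0 else -1

# Both users transmit simultaneously on the same carrier; the channel
# simply sums their chip streams.
received = [a + b for a, b in zip(spread(1, CODE_A), spread(-1, CODE_B))]

print(despread(received, CODE_A))  # 1   (user A's bit recovered)
print(despread(received, CODE_B))  # -1  (user B's bit recovered)
```

Correlating against CODE_A cancels user B's contribution entirely (and vice versa), which is the sense in which each user's signal is mere "noise" to the other.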
When digital data technology is utilized with the standard CDMA pseudo-noise codes, it is necessary for all transmitters and receivers within a cell to be synchronized to the same digital clock. This synchronization is provided by use of a "pilot" signal which is transmitted by the base station. Each mobile subscriber unit within a cell "locks" to this pilot signal and thereafter utilizes it as the clock signal for digital data-processing.
In standard CDMA, each base station transmits and receives on the same carrier frequency. Furthermore, in standard CDMA, each base station transmits the same periodic digital code, which is utilized as the pilot signal within each cell. Ordinarily, such a situation would give rise to severe interference between cells. Standard CDMA avoids this problem by phase-shifting (or time-staggering) the pilot signal, or digital code, transmitted within adjacent cells. Within standard CDMA, the carrier signal, pilot code, pseudo-noise codes, and phase-shifting (or time-staggering) of the pilot codes utilized in adjacent cells have all been chosen to work together such that inter-cell interference is minimized. Thus, not only does standard CDMA ensure that users in each cell appear to each other as if they are "speaking different languages," but standard CDMA also ensures that adjacent cells appear to each other as if each cell were in fact "speaking a different language."
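Why time-staggering the same pilot code separates cells can be seen from the autocorrelation of a maximal-length sequence: it peaks only at zero offset. The 7-chip sequence below is an illustrative toy assumption; actual pilot codes are vastly longer.

```python
# A length-7 maximal-length sequence (toy stand-in for a pilot code).
PILOT = [1, 1, 1, -1, 1, -1, -1]

def circular_correlation(a: list, shift: int) -> int:
    """Correlate a sequence with a circularly shifted copy of itself."""
    n = len(a)
    return sum(a[i] * a[(i + shift) % n] for i in range(n))

# A receiver locked to its own cell's (unshifted) pilot sees a strong
# peak; the time-staggered pilots of adjacent cells look like low-level
# noise, since every nonzero shift correlates to only -1.
print(circular_correlation(PILOT, 0))                         # 7
print([circular_correlation(PILOT, k) for k in range(1, 7)])  # [-1, -1, -1, -1, -1, -1]
```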
Although at first examination it appears that standard CDMA can provide virtually unlimited bandwidth, in actuality standard CDMA has provided an increase in capacity of only roughly 13 times that of AMPS. The primary reason for this is known in the art as "the near-far" problem. Returning to our analogy of two English speakers communicating in a room full of other non-English speakers, it is apparent that there is a practical limit with respect to how far apart the English speakers can be and still communicate. That is, due to the "noise" produced by the non-English-speaking persons in the room, there is a practical limit on how far apart the English speakers can be and still be understood by each other. This practical limit is dependent upon both the "noise" in the room and the "volume" which can be generated by each English speaker. That is, when the English speakers are "near" each other, they can communicate with relatively low "volume" (or "power" output), but when they are "far" from each other they can only communicate with relatively high "volume" (or "power" output).
A similar "near-far" problem exists in standard CDMA for roughly the same reasons. That is, the "noise" of the other users in a cell gives rise to a requirement that the mobile subscriber units in the cell increase their power outputs dependent upon both noise in the cell and the distance from the base station transceiver. This "near-far" problem thus puts a practical limit on the bandwidth available in standard CDMA, which has been found empirically to have a practical upper limit of 13 times AMPS.
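The power disparity at the heart of the near-far problem can be sketched numerically. The square-law path loss below is a simplifying assumption (real propagation exponents are typically larger), and the function name and distances are illustrative.

```python
def required_transmit_power(distance_km: float, target_rx_mw: float = 1.0) -> float:
    """Transmit power needed so the base station receives target_rx_mw,
    assuming received power falls off with the square of distance."""
    return target_rx_mw * distance_km ** 2

near = required_transmit_power(1.0)   # mobile 1 km from the base station
far = required_transmit_power(10.0)   # mobile 10 km from the base station

# Without power control, the far user must transmit 100x the power of
# the near user merely to be "heard" equally over the cell's noise --
# and that extra power is itself added noise for every other user.
print(far / near)  # 100.0
```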
As was the case for original cellular, AMPS, and DAMPS/TDMA, the additional bandwidth provided by standard CDMA is being quickly consumed by users. Consequently, newer CDMA systems are being developed to provide users with additional bandwidth. However, as will be shown in the detailed description, such newer systems, in certain instances, tend toward communication failure. It is therefore apparent that a need exists for a method and system which increase the reliability of such wireless communications systems by avoiding communication failure in the instances identified.