The origins of mobile radio telephony extend as far back as the early 1920's, when the Detroit Police Department instituted a police dispatch system using a frequency band near 2 MHz. This early system was such a success that the channels in the allocated bandwidth were soon filled to capacity. It quickly became necessary for the Federal Communications Commission (“FCC”) to open additional channel capacity. In 1934, the FCC responded by opening up channel capacity in the 30–40 MHz range. By the early 1940's, a large number of law enforcement and emergency agencies were utilizing mobile radios. In the late 1940's, the FCC made mobile radio service available to the private sector.
These early systems were based on a single, high-powered transmitter-receiver servicing a single geographic area. Each channel within the system could only support a single conversation at a time. With the popularity of the service and the limited number of channels available for a given area, the quality of service was not acceptable, especially in the law enforcement and emergency service sectors. Finally, in the 1970's, the FCC, in cooperation with industry leaders, developed a system architecture which gave birth to today's cellular telephony systems.
A cellular telephony system is a high-capacity, mobile radio system in which the frequency spectrum is divided into discrete channels that are assigned in groups to small geographic regions. A cellular transmitter-receiver within a geographic region communicates with cellular radios in the same region using the discrete channels assigned to that region. The key aspect of cellular telephony systems is that the transmitted power of the signals on a cellular channel is limited so as to enable the reuse or reassignment of the cellular channels in another geographic region that is a minimum distance away from any other region using the same cellular channels.
Today, several competing cellular telephony standards are in operation as well as in development. Some of these systems include the Advanced Mobile Phone System (AMPS), Narrowband Advanced Mobile Phone System (N-AMPS), TDMA, GSM, CDMA, EDGE, 3G, and PCS. Although the technology utilized in each of these systems can be quite varied, a common problem that arises is the optimization and layout of the cellular network.
A cellular telephony system divides a service area into a series of geographic regions or cells. Within each geographic region, a transmitter-receiver tower is established to cover that geographic region. Much research and testing has been performed in identifying optimal designs for cellular systems. The utilization of bandwidth within a cellular system is maximized by maximizing the reuse of the cellular channels within the system. However, reusing cellular channels without sufficient geographic separation may result in co-channel interference. To minimize co-channel interference, the reuse of cellular channels within a system must be limited. Thus, there is a need in the art for a system and a method for optimizing the configuration of a cellular telephony system that balances the minimization of co-channel interference with the maximization of bandwidth utilization.
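The tension between channel reuse and co-channel interference can be illustrated with the textbook approximation for an idealized hexagonal layout, in which a cell sees six equidistant first-tier co-channel interferers and the signal-to-interference ratio grows with the cluster size N. The following Python sketch is illustrative only; the function name and the default path-loss exponent are assumptions for purposes of example, not part of any cellular standard.

```python
import math

def co_channel_sir_db(n: int, path_loss_exp: float = 4.0) -> float:
    """Approximate downlink SIR, in dB, for a cell with six equidistant
    first-tier co-channel interferers: SIR = (1/6) * (sqrt(3N))**gamma,
    where N is the cluster size and gamma is the path-loss exponent."""
    sir = (math.sqrt(3 * n) ** path_loss_exp) / 6.0
    return 10.0 * math.log10(sir)

# Larger clusters reuse channels less often but suffer less interference:
for n in (3, 4, 7, 12):
    print(n, round(co_channel_sir_db(n), 1))
```

The example shows the tradeoff stated above: increasing the cluster size raises the signal-to-interference ratio but reduces the number of channels available per cell.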
In an ideal situation, the most optimal structure is to use hexagonal cells whose axes are inclined to one another at a sixty-degree angle. Given particular cell sizes and transmit powers for each transmitter-receiver, the distance necessary to separate cells that utilize the same set of cellular channels can easily be calculated. However, once you step away from the chalk board and enter the real world, one that is plagued by buildings, foliage, humidity, uneven terrain, and a host of other parameters, the chalk board calculations do not always provide optimum performance of the cellular telephony system. It would be exceedingly difficult to attempt to optimize the layout of a cellular system on paper based on each of the possible parameters that affect its operation. Thus, to optimize the layout of a cellular system, it is necessary to take signal measurements in the field. However, this can also be a tremendous task depending on the size of the cellular system, the terrain, and the resources available to the system operator. Thus, there is a need in the art for a system and a method to simplify the task of obtaining and analyzing field signal measurements of a cellular system.
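The chalk board calculation referred to above reduces, for ideal hexagonal cells of radius R and equal transmit powers, to a reuse distance of D = R·sqrt(3N), where the cluster size N takes the values N = i² + ij + j² for non-negative integers i and j. A minimal sketch of that idealized calculation follows; the function names and the example figures are illustrative assumptions, not values from the source.

```python
import math

def cluster_size(i: int, j: int) -> int:
    # Valid hexagonal cluster sizes: N = i^2 + i*j + j^2 (e.g. 1, 3, 4, 7, 12)
    return i * i + i * j + j * j

def reuse_distance(cell_radius_km: float, n: int) -> float:
    # Minimum separation between co-channel cells: D = R * sqrt(3 * N)
    return cell_radius_km * math.sqrt(3 * n)

# Example: a 7-cell cluster (i=2, j=1) with a 5 km cell radius
n = cluster_size(2, 1)
d = reuse_distance(5.0, n)  # about 22.9 km
```

As the surrounding text notes, real terrain, foliage, and buildings make this closed-form separation unreliable in practice, which is why field measurements are needed.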
As previously mentioned, optimizing a cellular system includes limiting co-channel interference. A problem associated with signal measurements taken in the field is distinguishing between valid channels and interfering channels. If the source of a signal (i.e., the transmitting cell tower) cannot be identified, then the determination of co-channel interference cannot be accomplished. Thus, there is a need in the art for a system and a method for identifying interference problems due to co-channel interference within a cellular telephony system.
A current technique that is being employed by service providers of cellular systems includes performing a drive test within the footprint of the cellular system to measure the received signal strength at various locations within the cellular system. In addition, the service provider predicts the performance of the cellular system using a network model, typically based on mathematical analysis. Invariably, the measured and predicted performance characteristics of the system are different. The service providers then perform adjustments to the system to improve the performance. They utilize these adjustments as inputs into the performance prediction process to determine the improvements in the performance of the cellular system. Any performance improvements identified during this analysis are assumed to be proportionately attributed to the measured performance of the system. This type of system is very prone to error. The average error for systems utilizing similar methodologies ranges from 9 to 12 dB. While this is marginally acceptable for a non-operating network, it is completely unacceptable for a system that is currently in operation. GSM networks, in normal operation, undergo operational changes that may affect network performance. Despite this, this corrected predicted data is still the main input for the frequency planning and capacity maximization process.
Although this technique may provide some performance enhancements to the cellular system, the improvements are uncertain, unverified, and inaccurate. Thus, there is a need in the art for a system and method to more accurately ascertain the actual operational characteristics of a cellular system before and after performing optimization adjustments.
Dedicated communication channels within a GSM system are managed through a time division multiplexing technique. The GSM standard defines traffic channels (TCH) that are used to carry information intended for a user. Each traffic channel is associated with another channel used for signaling. This associated channel, which is also a dedicated channel, is called the slow associated control channel (SACCH). Three broadcast channels are available in the GSM system. The broadcast control channel (BCCH) is used to send various system parameters to all mobile stations. These parameters include the operator identity, the location of the cell, the name of the cell, frequency information, and the like. The frequency correction channel (FCCH) is used by the base station to give the mobile station information about frequency references and is used for a frequency correction burst. The synchronization channel (SCH) is used by the base station to provide the mobile station with synchronization training sequences. Further details regarding the GSM specification are disclosed in the detailed specification.
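The GSM logical channels described above can be summarized in a simple lookup structure. The following Python sketch merely restates the channel roles given in the preceding paragraph; the dictionary layout and helper function are illustrative conveniences, not constructs drawn from the GSM specification.

```python
# GSM logical channels described above: name -> (class, role)
GSM_LOGICAL_CHANNELS = {
    "TCH":   ("dedicated", "traffic channel carrying user information"),
    "SACCH": ("dedicated", "slow associated control channel for signaling"),
    "BCCH":  ("broadcast", "system parameters: operator identity, cell location and name, frequencies"),
    "FCCH":  ("broadcast", "frequency references and frequency correction burst"),
    "SCH":   ("broadcast", "synchronization training sequences"),
}

def channels_of_class(channel_class: str) -> list[str]:
    # Return the channel names belonging to a class ("dedicated" or "broadcast")
    return [name for name, (cls, _) in GSM_LOGICAL_CHANNELS.items()
            if cls == channel_class]
```

For example, `channels_of_class("broadcast")` returns the three broadcast channels named in the text.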
Therefore, it is apparent that there is a need in the art for a system and a method for analyzing the efficiency of the current configuration of a cellular system and identifying optimization changes for a cellular telephony system to maximize the bandwidth and minimize the co-channel interference.