Current mobile radio communication systems rely on sophisticated radio resource management algorithms to maximize their performance in terms of capacity, coverage, and network stability. Because a mobile radio system involves multiple transmitters and receivers interacting with one another, the performance gains of such algorithms are difficult to predict analytically. System designers therefore generally employ computer-based simulation techniques to estimate the benefit of a specific algorithm prior to implementing it in an actual system.
One widely known source of information on system-level simulation of mobile radio systems is a technical report of the Third Generation Partnership Project (3GPP), which describes the basic methodology for static, snapshot-based simulation of wireless systems. The term “static” means that dynamic effects due to user movement and call arrivals and departures are not modeled. Instead, possible realizations of the system configuration in terms of user placement (i.e., “snapshots”) are simulated at specific instants of time. In each snapshot, the transmission power requirement of each user is computed by iterative power balancing, in which the mutual interference between users is modeled. It is then determined whether each user can sustain a viable connection, for example whether its signal-to-interference ratio (SIR) is sufficient; if not, the event is recorded for statistical analysis. Such simulations also permit extraction of other statistics, such as distributions of transmission power and interference levels. The accuracy of these statistics improves as the number of simulated snapshots increases.
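The iterative power balancing step for one snapshot can be sketched as follows. This is a minimal illustration using a standard fixed-point power-control iteration (in the style of Foschini–Miljanic), not the 3GPP report's exact procedure; the function name, link-gain matrix, and tolerances are assumptions made for the example.

```python
import numpy as np

def power_balance(gain, target_sir, noise, p_max, iters=200, tol=1e-9):
    """Iterative power balancing for one snapshot (illustrative sketch).

    gain[i, j]  -- link gain from transmitter j to receiver i
    target_sir  -- required SIR per link (linear scale)
    noise       -- receiver noise power
    p_max       -- maximum transmission power per user
    """
    n = gain.shape[0]
    p = np.full(n, 1e-6)  # start from near-zero power
    for _ in range(iters):
        # interference at each receiver: all other transmitters plus noise
        interf = gain @ p - np.diag(gain) * p + noise
        # power each link needs to hit its SIR target, capped at p_max
        p_new = np.minimum(target_sir * interf / np.diag(gain), p_max)
        if np.max(np.abs(p_new - p)) < tol:
            p = p_new
            break
        p = p_new
    sir = np.diag(gain) * p / (gain @ p - np.diag(gain) * p + noise)
    # links that cannot sustain the target SIR (0.1% tolerance) are in outage
    outage = sir < target_sir * 0.999
    return p, sir, outage
```

For users whose required power exceeds `p_max`, the cap keeps the iteration bounded and the final SIR falls short of the target, which is what marks the connection as non-viable for the snapshot statistics.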
Several radio resource management algorithms are used in the prior art. For example, algorithms responsible for user-to-timeslot allocation, also known as fast dynamic channel allocation (F-DCA), are particularly critical to the performance of time-slotted communication systems. Although some aspects of the prior-art methodology are generally applicable to the simulation of time division duplex (TDD) systems, the methodology falls far short of what is required to evaluate the performance of measurement-based F-DCA algorithms.
Measurement-based F-DCA algorithms base the timeslot allocation or re-allocation decision for a given user on the interference, received power (path loss), and transmission power measurements performed by the mobile unit and its serving base station in all candidate timeslots. When the performance of an F-DCA algorithm is simulated, the program must, prior to each invocation, provide the simulated F-DCA algorithm with the interference and transmission power levels that would be reported by the relevant nodes of the system. Those levels, however, become available only after the power balancing procedure is executed, and the power balancing procedure in turn requires that all users already be allocated a channel before it starts. Because the interference and transmission power levels are thus unavailable prior to channel allocation, this type of methodology fails to perform any meaningful validation of an F-DCA algorithm.
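The kind of decision such an algorithm makes can be illustrated with a minimal sketch; the function name, the least-interference selection rule, and all numerical values below are assumptions for illustration, not a specific prior-art algorithm. The point is that the decision consumes per-timeslot interference measurements, which a static snapshot simulation produces only after power balancing has run.

```python
def allocate_timeslot(candidate_slots, interference_dbm,
                      pathloss_db, p_max_dbm, sir_target_db):
    """Hypothetical measurement-based F-DCA decision: choose the
    candidate timeslot with the lowest measured interference in which
    the link budget can still meet the SIR target.

    interference_dbm[s] -- interference measured in slot s (dBm)
    pathloss_db         -- path loss to the serving base station (dB)
    """
    best = None
    for s in candidate_slots:
        # transmit power needed to reach the SIR target over this link
        required_dbm = sir_target_db + interference_dbm[s] + pathloss_db
        if required_dbm > p_max_dbm:
            continue  # slot not viable for this user
        if best is None or interference_dbm[s] < interference_dbm[best]:
            best = s
    return best  # None if no candidate slot is viable
```

Every input to this decision except `candidate_slots` is a measurement that, in the prior-art methodology, exists only as an output of the power balancing procedure, which is the circular dependency described above.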