The Universal Mobile Telecommunication System (UMTS) is one of the third generation mobile communication technologies designed to succeed GSM. 3GPP Long Term Evolution (LTE) is a project within the 3rd Generation Partnership Project (3GPP) to improve the UMTS standard to cope with future requirements in terms of improved services, such as higher data rates, improved efficiency and lowered costs. The Universal Terrestrial Radio Access Network (UTRAN) is the radio access network of a UMTS system, and Evolved UTRAN (E-UTRAN) is the radio access network of an LTE system. In an E-UTRAN, a user equipment (UE) 150 is wirelessly connected to a radio base station (RBS) 110a, commonly referred to as an eNodeB or eNB (E-UTRAN NodeB), as illustrated in FIG. 1. The eNBs 110a-c are directly connected to the core network (CN) 190.
Radio Resource Management (RRM) plays a crucial role in how resources in a wireless communications system are used. In particular, RRM techniques in wireless communications systems are of high importance as they largely influence how efficiently the system is used. Two RRM functionalities, scheduling and Link Adaptation (LA), play a central role for resource allocation and have a significant influence on system performance. These two RRM functionalities work tightly together. The scheduling allocates a certain part of a spectrum, i.e. of the available frequency resources, to a certain UE during a certain amount of time. The LA computes how many bits may be transmitted in the scheduled part of the frequency resource given the operating channel conditions, a transmit power and a desired probability of correct reception.
The scheduling and LA are used in a way that optimizes the frequency resource utilization in each cell separately. Other RRM functionalities promote the coordination between different cells, and are also very important for a good wireless communications system performance. For instance, schemes that try to mitigate and coordinate interference among different cells, commonly referred to as Inter-Cell Interference Coordination (ICIC) schemes, constitute one of the most intriguing areas in RRM. ICIC schemes try to coordinate the inter-cell interference generated between cells so that its effect becomes less detrimental, typically by utilizing feedback and exchanging information between neighboring radio base stations. ICIC schemes usually work on a slower time scale than the scheduling and LA in order to mitigate the increased overhead and complexity arising from the extra information exchange, signaling and processing needed for ICIC.
A main operating principle in conventional scheduling and LA is to transmit as many data bits as possible given a certain frequency resource allocation, or, expressed another way, to find the smallest possible frequency resource allocation given a certain number of data bits to transmit. At the same time, a certain probability of correct reception under the operating channel conditions should be satisfied. A commonly used criterion for the probability of correct reception is a Block Error Rate (BLER) target. The main operating principle is thus to maximize the spectral efficiency, measured in bits per second per Hz (bps/Hz), for the allocated resources. The more bits that may be transmitted over a certain part of the frequency resources over a fixed amount of time, the higher the spectral efficiency.
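The spectral efficiency measure above is simply the number of transmitted bits divided by the time-bandwidth product of the allocation. A small worked example, where the transport block size and the 10-PRB (1.8 MHz) allocation are illustrative values rather than entries from the 3GPP tables:

```python
def spectral_efficiency(bits: int, bandwidth_hz: float, duration_s: float) -> float:
    """Spectral efficiency in bits per second per Hz (bps/Hz)."""
    return bits / (bandwidth_hz * duration_s)

# Illustrative: a 5736-bit transport block sent over 10 LTE resource blocks
# (10 x 180 kHz = 1.8 MHz) during one 1 ms scheduling interval.
print(f"{spectral_efficiency(5736, 1.8e6, 1e-3):.2f} bps/Hz")  # prints 3.19 bps/Hz
```

Packing the same transport block into fewer resource blocks raises this figure, which is exactly what conventional scheduling and LA aim for.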
The spectral efficiency measure is without doubt a very important performance measure. However, the measure is mainly significant in case of fully loaded wireless communications systems. In other words, if the system is always fully loaded, i.e. if there is at least as much traffic to serve as the radio resources may support, then a higher spectral efficiency will lead to a better utilization of the resources as more UEs may be served. However, wireless communications systems are seldom fully or even highly loaded. Measurements from networks in operation show that only a fraction of the frequency resources are utilized most of the time and that all traffic may be served using just a portion of the available spectrum, with the exception of traffic in high density areas at peak hours. Most of the time UEs will be scheduled in a part of the frequency bandwidth only, whereas other parts of the frequency bandwidth will be free from transmissions, as illustrated in FIG. 2a. The frequency resources allocated to three UEs, UE1, UE2 and UE3, in a given scheduling interval sum up to a frequency resource utilization of only around 50% of the total frequency bandwidth, and the rest of the frequency resources 20 are unutilized. Such a scenario has two main limitations:
1. By scheduling with a high spectral efficiency, the Signal to Interference and Noise Ratio (SINR) requirement will be strict in order to support the efficient high-order Modulation and Coding Scheme (MCS).
2. By transmitting on just a part of the bandwidth, while leaving other parts of the bandwidth without transmissions, the inter-cell interference will vary significantly over the frequencies. It is not only the level of interference that affects the performance in a cell; the variation in the interference has an even more important effect on the performance, as the fluctuation in interference leads to a high unpredictability in the interference profile, making it hard to produce reliable interference estimations.
These two limitations have consequences both on the performance in the cell itself, i.e. on the intra-cell performance, and on the inter-cell performance, i.e. how a cell affects its neighbors.
FIG. 2b illustrates the required SINR as a function of frequency resource blocks for a scheduling interval corresponding to the resource allocation illustrated in FIG. 2a, as well as a mean value of the required SINR for each resource block. The required SINR is the level needed to meet the requirements for correct reception for a specific MCS and number of allocated resource blocks. The MCS and the number of resource blocks are obtained from 3GPP tables, whereas the resulting required SINR thresholds are determined from measurements in an LTE system based on the performance of turbo decoders. The large variance of the required SINR over the resource blocks is clearly illustrated, and arises because the UEs transmit only on a part of the resource blocks, with unused resource blocks in between. In resource blocks 0-10, UE1 is transmitting and the required SINR is 3.5 dB. In resource blocks 10-20 there is no transmission, so the required SINR level goes down. For illustration purposes, a floor of −10 dB is set for the non-utilized resource blocks. In resource blocks 20-30 the required SINR level goes up to 19 dB when UE2 is transmitting, and in resource blocks 40-50 the required SINR level is 9 dB. This variance may be translated into a large variance in the inter-cell interference levels.
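The variance described for FIG. 2b can be reproduced numerically from the values given above, with the illustrative −10 dB floor on the unused blocks. The script below is only a sketch of the profile; averaging directly in the dB domain is a simplification made for illustration.

```python
import statistics

# Required-SINR profile (dB) per resource block for the FIG. 2a allocation:
# UE1 on RBs 0-10 (3.5 dB), UE2 on RBs 20-30 (19 dB), UE3 on RBs 40-50 (9 dB);
# non-utilized resource blocks are shown at the illustrative -10 dB floor.
profile = [3.5] * 10 + [-10.0] * 10 + [19.0] * 10 + [-10.0] * 10 + [9.0] * 10

print(f"mean required SINR  = {statistics.mean(profile):.1f} dB")
print(f"stdev over the band = {statistics.stdev(profile):.1f} dB")
```

A standard deviation of more than 11 dB around a mean of about 2 dB shows how strongly the required SINR, and hence the generated inter-cell interference, fluctuates across the band.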
With conventional LA, an MCS of highest order, also referred to as the most efficient MCS, is chosen for a certain transmit/receive power, a desired Transport Block Size (TBS) and the resulting SINR based on the prevailing channel quality. However, the highest order of MCS typically means assigning the transport block to the smallest possible amount of resource blocks, which requires a high SINR. With a high SINR requirement, more power needs to be transmitted/received in order to reach a satisfactory performance for a given channel quality. A higher SINR requirement may thus be translated into a higher transmit power, and consequently into a higher interference to other cells.
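The conventional LA selection described above can be sketched with a small hypothetical MCS table; the modulation names, bits-per-resource-block figures and required-SINR thresholds below are invented for illustration and are not the 3GPP TBS tables. Picking the highest-order feasible MCS minimizes the number of resource blocks for the transport block, at the price of the highest SINR requirement.

```python
import math

# Hypothetical MCS table: (name, bits per resource block, required SINR in dB).
# All values are illustrative, not taken from any 3GPP specification.
MCS_TABLE = [
    ("QPSK 1/3",  100, -1.0),
    ("QPSK 2/3",  200,  3.0),
    ("16QAM 1/2", 300,  8.0),
    ("16QAM 3/4", 450, 12.0),
    ("64QAM 3/4", 650, 17.0),
]

def select_mcs(tbs_bits: int, est_sinr_db: float):
    """Conventional LA: choose the most efficient MCS the channel supports,
    which minimizes the number of resource blocks for the transport block.
    Returns (name, resource blocks needed, required SINR) or None."""
    feasible = [m for m in MCS_TABLE if m[2] <= est_sinr_db]
    if not feasible:
        return None
    name, bits_per_rb, required_sinr = max(feasible, key=lambda m: m[1])
    num_rbs = math.ceil(tbs_bits / bits_per_rb)
    return name, num_rbs, required_sinr

print(select_mcs(3000, 13.0))
```

For a 3000-bit transport block at an estimated 13 dB SINR this selects 16QAM 3/4 on 7 resource blocks; the 64QAM entry would compress the allocation further but requires 17 dB, illustrating how a smaller allocation translates into a stricter SINR requirement and thus a higher transmit power.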
In addition to a potentially higher interference, transmissions on only parts of the available resource blocks cause large fluctuations in the interference. These fluctuations significantly affect the performance of decoders and of many other functions such as LA and scheduling, since that performance depends on a reliable prediction of the interference.