Conventional wireless networks are implemented as cellular systems that include a number of base stations. A base station communicates with user terminals that are located inside the cell served by the base station. Unfortunately, cellular wireless networks are subject to inter-cell interference problems that can limit the capacity (data rates) of the network. Inter-cell interference is the interference experienced at a first user terminal due to other user terminals in cells outside the cell in which the first user terminal is located.
Orthogonal Frequency Division Multiple Access (OFDMA) is a multiple access scheme used in “fourth generation” (4G) cellular wireless standards, such as Long-Term Evolution (LTE), LTE-Advanced, and WiMAX (Worldwide Interoperability for Microwave Access). Among the advantages offered by OFDMA is scheduling flexibility: because users can be scheduled in both time and frequency, the scheduler can exploit time, frequency, and multi-user diversity.
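The multi-user diversity gain mentioned above can be sketched as an opportunistic scheduler over the frequency dimension of an OFDMA grid. The following is a minimal illustration, not any standard's actual scheduling algorithm; the user count, subchannel count, and random channel qualities are all hypothetical.

```python
import random

random.seed(0)

NUM_USERS = 4
NUM_SUBCHANNELS = 6  # frequency dimension of one OFDMA scheduling interval

# Hypothetical per-user, per-subchannel channel qualities (SINR-like
# metrics); in a real system these would come from channel feedback.
gains = {u: [random.random() for _ in range(NUM_SUBCHANNELS)]
         for u in range(NUM_USERS)}

# Opportunistic scheduling: assign each subchannel to the user whose
# channel is currently best on it, exploiting multi-user diversity.
assignment = {f: max(range(NUM_USERS), key=lambda u: gains[u][f])
              for f in range(NUM_SUBCHANNELS)}
```

Because different users fade independently, each subchannel is likely to find some user with a good channel, which is the diversity gain a time-only (non-OFDMA) scheduler cannot capture.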
In order to achieve high data rates in 4G and beyond-4G networks, aggressive frequency reuse is inevitable due to the scarcity of radio resources. Reuse 1 (universal reuse), in which all radio resources are reused in every sector, is an example of an aggressive frequency reuse scheme. While reuse 1 can potentially achieve high aggregate system throughput, it jeopardizes the throughput experienced by user terminals near the cell edge, due to the inter-cell interference those user terminals experience. Therefore, it is important for the network to use robust and efficient interference mitigation techniques.
Conventionally, inter-cell interference is mitigated by static resource partitioning and a priori frequency/sector planning (clustering), in which nearby sectors are assigned orthogonal resources. A common example is reuse 3 (cluster size = 3 sectors), in which adjacent sectors are assigned orthogonal channels. Although such techniques can reduce inter-cell interference and improve cell-edge throughput, they suffer from at least two drawbacks. First, with static resource partitioning, the aggregate network throughput is significantly reduced because each sector can use only a fraction of the available resources equal to the reciprocal of the reuse factor (e.g., in reuse 3, only one-third of the resources are available to a sector). Second, a priori frequency/sector planning may not be possible in emerging wireless networks, where new multi-tier network elements (such as relays, femto-/pico-base stations, and distributed antenna ports) are expected to be installed in an ad hoc manner, without prior planning.
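The reuse-3 partitioning and its throughput cost can be made concrete with a short sketch. The subband count and the modulo assignment rule below are illustrative assumptions, not a description of any particular deployment.

```python
NUM_SUBBANDS = 12  # total frequency subbands available in the network
REUSE_FACTOR = 3   # cluster size: adjacent sectors get orthogonal subbands

def subbands_for_sector(sector_index,
                        num_subbands=NUM_SUBBANDS,
                        reuse=REUSE_FACTOR):
    """Static reuse-3 partitioning (hypothetical rule): sector k within a
    cluster is restricted to every third subband, so each sector receives
    only 1/reuse of the resources -- the reciprocal of the reuse factor."""
    return [b for b in range(num_subbands)
            if b % reuse == sector_index % reuse]
```

Sectors 0, 1, and 2 of a cluster then hold pairwise-disjoint subband sets, which removes the dominant inter-cell interference at the cost of shrinking each sector's usable bandwidth to one-third.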
In an attempt to address the effects of static resource partitioning and a priori frequency/sector planning on aggregate network throughput, Fractional Frequency Reuse (FFR) schemes have been proposed. The key idea in FFR is to assign lower reuse factors to user terminals near the cell center and higher reuse factors to user terminals at the cell edge, the motivation being that cell-edge user terminals are more vulnerable to inter-cell interference than cell-center user terminals. Soft Frequency Reuse (SFR) and Partial Frequency Reuse (PFR) are two variations of FFR. While FFR schemes recover some of the throughput lost to static resource partitioning, they still require a priori frequency/cell planning, which, as noted above, is not compatible with the way future cellular networks are expected to be installed. As a result, an efficient inter-cell interference coordination (ICIC) scheme would be beneficial to the success of future cellular networks.
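The FFR idea of mixing reuse factors within a cell can be sketched as a band-selection rule. The band split below (nine shared center subbands, one edge subband per sector) is a hypothetical example chosen for illustration; actual FFR configurations differ.

```python
FULL_BAND = list(range(12))  # all subbands in the system

# Hypothetical FFR split: subbands 0-8 form the reuse-1 "center" band
# shared by every sector; subbands 9-11 are orthogonal edge subbands,
# one per sector in a 3-sector cluster.
CENTER_BAND = FULL_BAND[:9]
EDGE_BANDS = {0: [9], 1: [10], 2: [11]}  # sector index -> edge subband(s)

def ffr_subbands(sector_index, is_cell_edge):
    """Return the subbands a user may be scheduled on under this sketch.

    Cell-center users see reuse 1 (the shared center band); cell-edge
    users are confined to their sector's orthogonal edge subband,
    i.e., a higher effective reuse factor where interference hurts most."""
    if is_cell_edge:
        return EDGE_BANDS[sector_index % 3]
    return CENTER_BAND
```

Compared with uniform reuse 3, most of the band is reused everywhere, so only the interference-vulnerable edge users pay the partitioning cost.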