The 3rd-Generation Partnership Project (3GPP) is continuing development of the fourth-generation wireless network technologies known as Long-Term Evolution (LTE). Improved support for heterogeneous network operations is part of the ongoing specification of 3GPP LTE Release-10, and further improvements are being discussed in the context of new features for Release-11. In heterogeneous networks, a mixture of cells of different sizes and overlapping coverage areas is deployed.
One example of such a deployment is seen in the system 100 illustrated in FIG. 1, where several pico-cells 120, each with a respective coverage area 150, are deployed within the larger coverage area 140 of a macro-cell 110. The system 100 of FIG. 1 is suggestive of a wide-area wireless network deployment. However, other examples of low power nodes, also referred to as “points,” in heterogeneous networks are home base stations and relays.
Throughout this document, nodes or points in a network are often referred to as being of a certain type, e.g., a “macro” node, or a “pico” point. However, unless explicitly stated otherwise, this should not be interpreted as an absolute quantification of the role of the node or point in the network but rather as a convenient way of discussing the roles of different nodes or points relative to one another. Thus, a discussion about macro- and pico-cells could just as well be applicable to the interaction between micro-cells and femto-cells, for example.
One aim of deploying low-power nodes such as pico base stations within the macro coverage area is to improve system capacity, by means of cell-splitting gains. In addition to improving overall system capacity, this approach also allows users to be provided with a wide-area experience of very-high-speed data access, throughout the network. Heterogeneous deployments are particularly effective for covering traffic hotspots, i.e., small geographical areas with high user densities. These areas can be served by pico cells, for example, as an alternative to a denser macro deployment.
The most basic means of operating heterogeneous networks is to apply frequency separation between the different layers. For instance, the macro-cell 110 and pico-cells 120 pictured in FIG. 1 can be configured to operate on different, non-overlapping carrier frequencies, thus avoiding any interference between the layers. With no macro-cell interference towards the under-laid cells, cell-splitting gains are achieved, since all resources can be used simultaneously by the under-laid cells.
One drawback of operating layers on different carrier frequencies is that it may lead to inefficiencies in resource utilization. For example, if there is a low level of activity in the pico-cells, it could be more efficient to use all carrier frequencies in the macro-cell, and then basically switch off the pico-cells. However, the split of carrier frequencies across layers in this basic configuration is typically done in a static manner.
Another approach to operating a heterogeneous network is to share radio resources between layers. Thus, two (or more) layers can use the same carrier frequencies, by coordinating transmissions across macro- and under-laid cells. This type of coordination is referred to as inter-cell interference coordination (ICIC). With this approach, certain radio resources are allocated to the macro cells for a given time period, whereas the remaining resources can be accessed by the under-laid cells without interference from the macro cell. Depending on the traffic situation across the layers, this resource split can change over time to accommodate different traffic demands. In contrast to the static allocation of carrier frequencies described earlier, this way of sharing radio resources across layers can be made more or less dynamic, depending on the implementation of the interface between the nodes. In LTE, for example, an X2 interface has been specified in order to exchange different types of information between base station nodes, for coordination of resources. One example of such information exchange is that a base station can inform other base stations that it will reduce transmit power on certain resources.
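The time-domain resource split described above can be sketched in code. The function name, the 60/40 split, and the per-frame granularity below are illustrative assumptions for exposition only, not part of any 3GPP specification:

```python
# Illustrative sketch of a time-domain ICIC resource split: within each
# 10-subframe radio frame, an initial run of subframes is reserved for the
# macro cell, and the remaining subframes are left for the under-laid (pico)
# cells. The helper name and the chosen split are hypothetical.

def split_subframes(macro_share, num_subframes=10):
    """Return (macro_subframes, pico_subframes) for one radio frame.

    macro_share -- fraction of subframes allocated to the macro cell; this
                   can be changed over time as traffic demands shift.
    """
    n_macro = round(macro_share * num_subframes)
    macro = list(range(n_macro))                    # macro cell transmits here
    pico = list(range(n_macro, num_subframes))      # pico cells are protected here
    return macro, pico

macro_sf, pico_sf = split_subframes(macro_share=0.6)
# The macro cell transmits only in macro_sf, so the pico cells can use
# pico_sf without interference from the macro layer.
```

Making the split more dynamic then amounts to re-invoking such a partition with a new `macro_share`, based on load information exchanged over an inter-node interface such as X2.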
Time synchronization between base station nodes is generally required to ensure that ICIC across layers will work efficiently in heterogeneous networks. This is of particular importance for time-domain-based ICIC schemes, where resources are shared in time on the same carrier.
Orthogonal Frequency-Division Multiplexing (OFDM) technology is a key underlying component of LTE. As is well known to those skilled in the art, OFDM is a digital multi-carrier modulation scheme employing a large number of closely-spaced orthogonal sub-carriers. Each sub-carrier is separately modulated using conventional modulation techniques and channel coding schemes. In particular, 3GPP has specified Orthogonal Frequency Division Multiple Access (OFDMA) for the downlink transmissions from the base station to a mobile terminal, and single carrier frequency division multiple access (SC-FDMA) for uplink transmissions from a mobile terminal to a base station. Both multiple access schemes permit the available sub-carriers to be allocated among several users.
SC-FDMA technology employs specially formed OFDM signals, and is therefore often called “pre-coded OFDM” or Discrete-Fourier-Transform (DFT)-spread OFDM. Although similar in many respects to conventional OFDMA technology, SC-FDMA signals offer a reduced peak-to-average power ratio (PAPR) compared to OFDMA signals, thus allowing transmitter power amplifiers to be operated more efficiently. This in turn facilitates more efficient usage of a mobile terminal's limited battery resources. (SC-FDMA is described more fully in Myung, et al, “Single Carrier FDMA for Uplink Wireless Transmission,” IEEE Vehicular Technology Magazine, vol. 1, no. 3, September 2006, pp. 30-38.)
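The PAPR advantage of DFT-spread OFDM can be illustrated numerically. The sketch below is a toy model, not an LTE implementation: it uses a pure-Python DFT rather than an FFT library, an arbitrary block size and seed, and the degenerate case where the DFT-precoding size equals the IFFT size, in which the DFT-spread waveform reduces to the constant-envelope QPSK sequence itself:

```python
import cmath
import math
import random

def dft(x):
    """Naive discrete Fourier transform (O(N^2); fine for a small example)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    """Naive inverse DFT with 1/N normalization."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def papr_db(samples):
    """Peak-to-average power ratio of a block of time-domain samples, in dB."""
    powers = [abs(s) ** 2 for s in samples]
    return 10 * math.log10(max(powers) / (sum(powers) / len(powers)))

random.seed(1)
qpsk = [complex(random.choice((-1, 1)), random.choice((-1, 1))) / math.sqrt(2)
        for _ in range(16)]

ofdm_time = idft(qpsk)            # conventional OFDM: symbols straight onto subcarriers
scfdma_time = idft(dft(qpsk))     # "pre-coded" OFDM: DFT-spread first, then IFFT

# The DFT-spread block has (near-)constant envelope here, i.e. ~0 dB PAPR,
# while the plain OFDM block shows the higher peaks typical of multi-carrier
# signals -- the property that lets SC-FDMA power amplifiers run more efficiently.
```

In a real SC-FDMA transmitter the DFT-precoding size is smaller than the IFFT size and the precoded outputs are mapped onto an allocated set of subcarriers, but the qualitative PAPR comparison is the same.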
The basic LTE physical resource can be seen as a time-frequency grid. This concept is illustrated in FIG. 2, which shows a number of so-called subcarriers in the frequency domain, at a frequency spacing of Δf, divided into OFDM symbol intervals in the time domain. Each individual element of the resource grid 210 is called a resource element 220, and corresponds to one subcarrier during one OFDM symbol interval, on a given antenna port. One aspect of OFDM is that each symbol 230 begins with a cyclic prefix 240, which is essentially a reproduction of the last portion of the symbol 230 affixed to the beginning. This feature minimizes problems from multipath, over a wide range of radio signal environments.
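The cyclic-prefix construction described above is simple enough to sketch directly. The sample values and prefix length below are arbitrary toy numbers, not LTE parameters:

```python
# The cyclic prefix 240 of FIG. 2: a copy of the last portion of the
# time-domain symbol is affixed to its beginning, so that multipath echoes
# shorter than the prefix do not smear one symbol into the next.

def add_cyclic_prefix(symbol_samples, cp_len):
    """Prepend a copy of the last cp_len time-domain samples to the symbol."""
    return symbol_samples[-cp_len:] + symbol_samples

def remove_cyclic_prefix(rx_samples, cp_len):
    """Discard the prefix at the receiver before demodulation."""
    return rx_samples[cp_len:]

symbol = [1, 2, 3, 4, 5, 6, 7, 8]              # toy time-domain samples
tx = add_cyclic_prefix(symbol, cp_len=2)       # -> [7, 8, 1, 2, 3, 4, 5, 6, 7, 8]
```

Discarding the prefix at the receiver restores the original symbol, while any inter-symbol interference from delayed multipath copies falls within the discarded samples.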
In the time domain, LTE downlink transmissions are organized into radio frames of ten milliseconds each, each radio frame consisting of ten equally-sized subframes of one millisecond duration. This is illustrated in FIG. 3, where an LTE signal 310 includes several frames 320, each of which is divided into ten subframes 330. Not shown in FIG. 3 is that each subframe 330 is further divided into two slots, each of which is 0.5 milliseconds in duration.
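The frame/subframe/slot hierarchy above maps a point in time to its position in the LTE downlink structure by straightforward arithmetic; the helper name below is illustrative:

```python
# Durations from the LTE downlink frame structure: a 10 ms radio frame holds
# ten 1 ms subframes, and each subframe holds two 0.5 ms slots.
FRAME_MS, SUBFRAME_MS, SLOT_MS = 10.0, 1.0, 0.5

def lte_timing(t_ms):
    """Map a time in milliseconds to (frame index, subframe within the
    frame, slot within the subframe)."""
    frame = int(t_ms // FRAME_MS)
    subframe = int(t_ms % FRAME_MS // SUBFRAME_MS)
    slot = int(t_ms % SUBFRAME_MS // SLOT_MS)
    return frame, subframe, slot

# 23.7 ms into the signal: third radio frame, subframe 3, second slot.
print(lte_timing(23.7))  # → (2, 3, 1)
```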
LTE link resources are organized into "resource blocks," defined as time-frequency blocks with a duration of 0.5 milliseconds, corresponding to one slot, and encompassing a bandwidth of 180 kHz, corresponding to 12 contiguous sub-carriers with a spacing of 15 kHz. Resource blocks are numbered in the frequency domain, starting with 0 from one end of the system bandwidth. Two time-consecutive resource blocks represent a resource block pair, and correspond to the time interval upon which scheduling operates. Of course, the exact definition of a resource block may vary between LTE and similar systems, and the inventive methods and apparatus described herein are not limited to these particular numbers.
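The resource-block arithmetic above follows directly from the stated numbers; the function names below are illustrative:

```python
# Frequency-domain resource-block bookkeeping from the figures above:
# 12 contiguous sub-carriers at 15 kHz spacing give 180 kHz per resource block.
SUBCARRIER_SPACING_KHZ = 15
SUBCARRIERS_PER_RB = 12
RB_BANDWIDTH_KHZ = SUBCARRIER_SPACING_KHZ * SUBCARRIERS_PER_RB  # 180 kHz

def resource_block_index(subcarrier_index):
    """Resource blocks are numbered from 0 starting at one end of the
    system bandwidth, so sub-carrier k belongs to block k // 12."""
    return subcarrier_index // SUBCARRIERS_PER_RB

def occupied_bandwidth_khz(num_resource_blocks):
    """Total bandwidth spanned by a contiguous set of resource blocks."""
    return num_resource_blocks * RB_BANDWIDTH_KHZ

# Sub-carrier 25 falls in resource block 2; 50 resource blocks span 9 MHz.
print(resource_block_index(25), occupied_bandwidth_khz(50))
```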
In general, however, resource blocks may be dynamically assigned to mobile terminals, and may be assigned independently for the uplink and the downlink. Depending on a mobile terminal's data throughput needs, the system resources allocated to it may be increased by allocating resource blocks across several sub-frames, or across several frequency blocks, or both. Thus, the instantaneous bandwidth allocated to a mobile terminal in a scheduling process may be dynamically adapted to respond to changing conditions.
For scheduling of downlink data, the base station transmits control information in each subframe. This control information identifies the mobile terminals to which data is targeted and the resource blocks, in the current downlink subframe, that are carrying the data for each terminal. The first one, two, three, or four OFDM symbols in each subframe are used to carry this control signaling. In FIG. 4, a downlink subframe 410 is shown, with three OFDM symbols allocated to control region 420. The control region 420 consists primarily of control data elements 434, but also includes a number of reference symbols 432, used by the receiving station to measure channel conditions. These reference symbols 432 are interspersed at pre-determined locations throughout the control region 420 and among the data symbols 436 in the data portion 430 of the subframe 410.
Transmissions in LTE are dynamically scheduled in each subframe, where the base station transmits downlink assignments/uplink grants to certain mobile terminals (user equipment, or UEs, in 3GPP terminology) via the physical downlink control channel (PDCCH). The PDCCHs are transmitted in the control region of the OFDM signal, i.e., in the first OFDM symbol(s) of each subframe, and span all or almost all of the system bandwidth. A UE that has decoded a downlink assignment, carried by a PDCCH, knows which resource elements in the subframe contain data intended for that particular UE. Similarly, upon receiving an uplink grant, the UE knows which time-frequency resources it should transmit upon. In the LTE downlink, data is carried by the physical downlink shared channel (PDSCH), and in the uplink the corresponding channel is referred to as the physical uplink shared channel (PUSCH).
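The per-subframe assignment of resource blocks to UEs can be sketched with a toy scheduler. The greedy first-come allocation, the UE identifiers, and the data structures below are hypothetical simplifications; they are not the LTE scheduling algorithm, which is implementation-specific:

```python
# Toy per-subframe downlink scheduler: the base station grants contiguous
# resource-block ranges to UEs (as a PDCCH assignment would convey), and each
# UE then knows which blocks carry its PDSCH data in that subframe.

def schedule_subframe(demands, total_rbs):
    """Greedily assign contiguous resource-block ranges to UEs.

    demands -- {ue_id: number_of_resource_blocks_requested}, in priority order
    Returns {ue_id: list_of_rb_indices}, using at most total_rbs blocks.
    """
    assignments, next_rb = {}, 0
    for ue_id, wanted in demands.items():
        granted = min(wanted, total_rbs - next_rb)
        if granted <= 0:
            break                                   # carrier fully allocated
        assignments[ue_id] = list(range(next_rb, next_rb + granted))
        next_rb += granted
    return assignments

grants = schedule_subframe({"ue1": 4, "ue2": 6}, total_rbs=8)
# "ue1" receives resource blocks 0-3; "ue2" receives the remaining blocks 4-7.
```

A fresh set of grants is computed every subframe, which is what makes the allocation dynamic in both time and frequency.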
LTE also employs multiple modulation formats, including at least QPSK, 16-QAM, and 64-QAM, as well as advanced coding techniques, so that data throughput may be optimized for any of a variety of signal conditions. Depending on the signal conditions and the desired data rate, a suitable combination of modulation format, coding scheme, and bandwidth is chosen, generally to maximize the system throughput. Power control is also employed to ensure acceptable bit error rates while minimizing interference between cells. In addition, LTE uses a hybrid-ARQ (HARQ) error correction protocol where, after receiving downlink data in a subframe, the terminal attempts to decode it and reports to the base station whether the decoding was successful (ACK) or not (NACK). In the event of an unsuccessful decoding attempt, the base station can retransmit the erroneous data.
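The hybrid-ARQ ACK/NACK loop described above can be sketched as a simple retransmission routine. The retransmission cap and function names are illustrative assumptions; in LTE the HARQ behavior is governed by the MAC specification and configurable parameters:

```python
# Minimal sketch of the HARQ loop: the base station (re)transmits a block
# until the terminal reports successful decoding (ACK), or a hypothetical
# maximum number of transmissions is reached.

def harq_transmit(send_and_decode, max_transmissions=4):
    """Return (success, attempts_used).

    send_and_decode -- callable modeling one transmission plus the terminal's
    decode attempt; returns True for ACK, False for NACK.
    """
    for attempt in range(1, max_transmissions + 1):
        if send_and_decode():
            return True, attempt         # terminal reported ACK
    return False, max_transmissions      # give up after the final NACK

# Example: the first two decode attempts fail, the third succeeds.
outcomes = iter([False, False, True])
print(harq_transmit(lambda: next(outcomes)))  # → (True, 3)
```

In a real HARQ scheme the retransmissions also benefit from soft combining at the receiver, so each attempt is more likely to succeed than the last; that refinement is omitted here.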