With the exponential growth of wireless communication, new techniques are needed to handle the high capacity of voice and data carried over wireless communication networks. The 3rd Generation Partnership Project (3GPP) Long Term Evolution (LTE) (referred to as “LTE” hereinafter) is a promising network proposal to meet the challenge of increased traffic.
For LTE, the orthogonal frequency-division multiplexing (OFDM) modulation scheme is chosen for the transmission of the downlink signals between a transmitter, such as a base station, and a terminal/receiver, such as user equipment (UE) (e.g., mobile communication devices such as cell phones, etc.). Meanwhile, a special type of modulation method, which is termed single-carrier frequency-division multiple access (SC-FDMA), is used for the transmission of uplink signals.
Detailed information on LTE can be found in Rumney, LTE and the Evolution of 4G Wireless, John Wiley, © 2009; Sesia, LTE: The UMTS Long Term Evolution, Wiley, © 2009; and the standard documents for E-UTRA: 3GPP TS 36.211: “Evolved Universal Terrestrial Radio Access (E-UTRA); Physical channels and modulation;” 3GPP TS 36.212: “Evolved Universal Terrestrial Radio Access (E-UTRA); Multiplexing and channel coding;” and 3GPP TS 36.213: “Evolved Universal Terrestrial Radio Access (E-UTRA); Physical layer procedures;” the disclosures of which are incorporated by reference herein.
OFDM is a multi-carrier modulation scheme used in many digital communication systems. In OFDM, a large number of closely spaced orthogonal subcarriers are used to transmit data. The data are divided into several parallel data streams, one for each sub-carrier. Each sub-carrier is modulated with a conventional modulation scheme such as QAM, PSK, BPSK, or QPSK, at a low symbol rate, while maintaining a total data rate similar to that of single-carrier modulation schemes in the same channel bandwidth. The baseband signal in an OFDM system is the sum of these modulated sub-carriers, which is then used to modulate a main RF signal. An important aspect of demodulating such a signal, and thereby retrieving the underlying baseband signal, is processing it with a Fast Fourier Transform (FFT). The benefits of OFDM include high spectral efficiency and resiliency to radio-frequency (RF) interference and multi-path propagation.
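The modulation and FFT-based demodulation described above can be sketched as follows. This is a minimal, illustrative round trip over an ideal channel, not the LTE waveform; the subcarrier count and the QPSK bit mapping are assumptions made for the example.

```python
import numpy as np

# Minimal OFDM round trip (illustrative sketch, not the LTE waveform):
# QPSK symbols on N orthogonal subcarriers -> IFFT gives the baseband
# signal -> FFT at the receiver recovers the subcarrier symbols.
N = 64                                   # number of subcarriers (assumed)
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=2 * N)

# Map bit pairs to QPSK constellation points (bit 0 -> +1, bit 1 -> -1).
qpsk = (1 - 2.0 * bits[0::2]) + 1j * (1 - 2.0 * bits[1::2])

tx_time = np.fft.ifft(qpsk)              # OFDM baseband: sum of subcarriers
rx_freq = np.fft.fft(tx_time)            # FFT-based demodulation

assert np.allclose(rx_freq, qpsk)        # ideal channel: symbols recovered
```

In a real system the time-domain signal would also carry a cyclic prefix and pass through a dispersive channel, which is why the channel estimation discussed below is needed.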
In all communication systems, including LTE, variations in the phase and amplitude are introduced into the transmitted signals as they propagate along the channel. These variations are referred to as the channel response, and the channel response is usually frequency and time-dependent. If the receiver can determine the channel response, the received signal can be corrected to compensate for the channel degradation. The determination of the channel response is called channel estimation. In the currently defined LTE system (3GPP Release 8), a number of resource elements have been chosen to carry pilot signals (also known as “reference signals”) for channel estimation purposes. The pilot signals contain known information that permits the channel estimator to determine the channel response on that carrier frequency at that particular instant in time by comparing the actually received signal with an expected signal, i.e. one that the receiver would have received under ideal channel conditions. The resource elements conveying the pilot signals are distributed in the time and frequency domains in a pilot signal pattern that is defined in the LTE standard (3GPP Release 8), and which permits the channel response of the resource elements not containing any pilot signal to be accurately estimated by interpolating the channel responses determined for the pilot signal-carrying resource elements (referred to as “pilot resource elements” hereinafter). Conventional interpolation methods include minimum mean-square error (MMSE) estimation, least-square (LS) estimation, linear interpolation and averaging.
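The comparison of a received pilot with its known transmitted value can be sketched as a least-square (LS) estimate at each pilot resource element: the receiver divides what it received (Y) by what it knows was sent (X). The pilot and channel values below are illustrative assumptions, not values from the LTE standard, and the channel is taken as noiseless for clarity.

```python
import numpy as np

# LS channel estimation at pilot resource elements: H = Y / X, where X is
# the known pilot symbol and Y is the actually received symbol.
# All numeric values here are illustrative, not from the standard.
pilots_tx = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
true_channel = np.array([0.9 + 0.1j, 0.8 - 0.2j, 1.1 + 0.05j, 0.95 + 0.3j])

pilots_rx = true_channel * pilots_tx     # what the receiver observes
h_ls = pilots_rx / pilots_tx             # LS estimate per pilot position

assert np.allclose(h_ls, true_channel)   # noiseless case: exact recovery
```

With noise present, the LS estimates are noisy samples of the channel response, which is what the interpolation stage then smooths and extends to the remaining resource elements.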
The traditional methods of OFDM channel estimation can be divided into two major steps: the first step is an LS estimation that is carried out on the pilot resource elements, and the second step is an MMSE interpolation that is subsequently carried out on both the time and frequency domains to estimate the rest of the resource elements. These kinds of channel estimation methods which carry out MMSE interpolation on both the time and frequency domains are referred to as 2D MMSE channel estimation. The 2D MMSE estimation method exploits the channel correlation that is typically present along both the time axis and the frequency axis, and in general provides an acceptable level of performance in terms of frame error rate (FER). However, due to the wide signal carrier bandwidth and the use of multiple antennas, this type of channel estimation method is far too complex to be implemented on LTE systems and therefore less complex versions that provide performance close to that of a 2D MMSE estimation have been developed. Separable 2D MMSE estimation is an example of a commonly-used channel estimation method with lower complexity. In this method, an MMSE estimation is first performed in one dimension based on the channel correlation in that dimension and then an MMSE estimation is performed in another dimension which in turn exploits the correlation in that dimension.
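One dimension of the separable estimation described above can be sketched as a 1D MMSE interpolation along the frequency axis; in separable 2D MMSE the same step is then repeated along the time axis. The exponential correlation profile, pilot spacing, and noise variance below are assumptions chosen for illustration.

```python
import numpy as np

# One-dimensional MMSE interpolation along frequency (a sketch; the
# correlation model and noise level are assumed, not from the standard).
def mmse_interpolate(h_pilot, pilot_idx, n_sub, corr, noise_var):
    """Estimate the channel on all n_sub subcarriers from pilot estimates.

    corr(d) must return the channel correlation at subcarrier distance d.
    """
    all_idx = np.arange(n_sub)
    # Cross-correlation between every subcarrier and the pilot positions.
    R_dp = corr(np.abs(all_idx[:, None] - pilot_idx[None, :]))
    # Auto-correlation among the pilot positions, plus the noise term.
    R_pp = corr(np.abs(pilot_idx[:, None] - pilot_idx[None, :]))
    # MMSE weights: W = R_dp (R_pp + sigma^2 I)^(-1)
    W = R_dp @ np.linalg.inv(R_pp + noise_var * np.eye(len(pilot_idx)))
    return W @ h_pilot

# Example with an assumed exponential correlation profile.
corr = lambda d: np.exp(-0.1 * d)
pilot_idx = np.array([0, 6, 12, 18])     # assumed pilot subcarrier positions
h_pilot = np.array([1.0 + 0.0j, 0.9 + 0.1j, 0.8 + 0.2j, 0.7 + 0.3j])
h_all = mmse_interpolate(h_pilot, pilot_idx, 24, corr, noise_var=0.01)
```

Running this step once per dimension, rather than inverting one large matrix over the full time-frequency grid, is what gives the separable method its complexity advantage over full 2D MMSE.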
Another way of simplifying the channel estimation process is by replacing MMSE interpolation with less complex algorithms, such as linear interpolation and averaging.
In general, linear interpolation is a mathematical operation for estimating values that lie between two known values or points. Given two known points A and B with Cartesian coordinates A=(xA, yA) and B=(xB, yB), the ordinate yP of an interpolated point with abscissa xP is calculated with the following formula for linear interpolation:
yP = yA + ((xP − xA) / (xB − xA)) · (yB − yA)    (1)

In the context of channel estimation, x would denote the location of a resource element in the time or frequency domain, while y would denote the value of an estimated channel parameter. In actual implementations of the linear interpolation method, simplified versions of equation (1) are sometimes derived in order to avoid the circuit complexity required to perform the calculations of equation (1).
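Equation (1) translates directly into code. In the sketch below, a channel parameter is interpolated between two pilot subcarriers; the pilot positions and channel values are illustrative assumptions.

```python
# Linear interpolation of a channel parameter between two pilot resource
# elements, directly implementing equation (1).
def lerp(x_a, y_a, x_b, y_b, x_p):
    return y_a + (x_p - x_a) / (x_b - x_a) * (y_b - y_a)

# Pilots on subcarriers 0 and 6 (assumed positions); estimate the complex
# channel value on subcarrier 3, halfway between them.
h_mid = lerp(0, 1.0 + 0.2j, 6, 0.4 + 0.8j, 3)   # ≈ 0.7 + 0.5j
```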
Averaging in the context of channel estimation refers to the addition of the estimated channel parameters for two or more pilot resource elements, and subsequently dividing the sum total by the number of sampled pilot resource elements.
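The averaging operation is correspondingly simple; the pilot estimates below are illustrative values.

```python
# Averaging of estimated channel parameters over several pilot resource
# elements: sum the estimates, then divide by the number of pilots sampled.
pilot_estimates = [0.9 + 0.1j, 1.1 - 0.1j, 1.0 + 0.2j, 1.0 - 0.2j]
h_avg = sum(pilot_estimates) / len(pilot_estimates)   # -> (1+0j)
```

Averaging suppresses noise on the pilot estimates but, unlike interpolation, cannot track variation of the channel between the averaged positions.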
The simplification of the traditional OFDM channel estimation methods by means of linear interpolation and averaging has led to a reduction in hardware complexity, but at the cost of degraded estimation performance.
Once the channel estimation is completed for the entire time-frequency grid within one downlink subframe, the receiver uses the channel estimate to determine, from the received data symbols, the original transmitted data signals. The receiver then performs symbol demapping, de-interleaving and decoding on the equalized data symbols in accordance with the coding and modulation schemes used for the transmitted data.
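The recovery of the transmitted symbols from the channel estimate can be sketched as a per-subcarrier (one-tap) equalization followed by hard-decision demapping. The zero-forcing equalizer and Gray-mapped QPSK demapping rule below are assumptions for illustration; practical receivers may use MMSE equalizers and soft demapping instead.

```python
import numpy as np

# One-tap zero-forcing equalization followed by hard-decision QPSK
# demapping (a sketch; mapping convention is assumed: bit 0 -> +1).
def qpsk_demap(symbols):
    bits = np.empty(2 * len(symbols), dtype=int)
    bits[0::2] = (symbols.real < 0).astype(int)
    bits[1::2] = (symbols.imag < 0).astype(int)
    return bits

h_est = np.array([0.9 + 0.1j, 0.8 - 0.2j])      # channel estimate (assumed)
tx = np.array([1 + 1j, -1 - 1j]) / np.sqrt(2)   # transmitted QPSK symbols
rx = h_est * tx                                  # received data symbols
eq = rx / h_est                                  # equalized symbols
```

Here `qpsk_demap(eq)` recovers the bit sequence `[0, 0, 1, 1]`; the de-interleaving and decoding stages mentioned above would then operate on these (or soft) bit decisions.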
Channel estimation is one of the most critical parts of an OFDM system for obtaining good performance. Accurate channel estimation in an OFDM signal receiver is crucial for the recovery of the transmitted information data, so the interpolation performance must be of sufficiently high quality. However, as illustrated in the preceding paragraphs, there is usually a compromise between hardware complexity and channel estimation performance. For the case of LTE, since the pilot signal pattern has already been defined in the standard, the channel estimation methods in the receiver must be designed to take full advantage of the available pilot signals embedded in the transmitted signal while taking into account the workload that will be imposed on the computational hardware. Although there are existing channel estimation techniques that are able to provide an acceptable level of performance, a majority of these techniques are far too complex to be implemented on LTE user equipment in the near future.
Therefore, considering that the commercialization of the LTE system is already underway, there is now a strong need in the art for an improved method of channel estimation for LTE terminal receivers which combines good performance with low hardware complexity.