Orthogonal Frequency-Division Multiplexing (OFDM) is a prominent multi-carrier transmission technique used in wireless communication. Wireless channels are typically frequency selective, which gives rise to inter-symbol interference. OFDM helps combat the inter-symbol interference caused by the frequency-selective nature of wireless channels and is therefore a useful multi-carrier technique. In satellite telecommunications, a downlink is the link from a satellite down to one or more ground stations or receivers. Long Term Evolution (LTE) is a standard for high-speed wireless communication for mobile devices and data terminals. OFDM has been adopted as the transmission strategy for the downlink in LTE systems since Release 8.
In OFDM transmission, modulated data is loaded onto a set of sub-carriers, followed by an Inverse Fast Fourier Transform (IFFT) operation on them. After addition of a cyclic prefix, the resultant block of symbols is transmitted over the channel. At the receiver, after removing the cyclic prefix, a Fast Fourier Transform (FFT) operation is performed over the symbol block to recover the transmitted symbols. As noted earlier, these symbols are affected by the selectivity of the channel during transmission. To reduce channel effects, equalization is carried out at the receiver for all the symbols, using an estimate of the channel parameters experienced by the symbols over each of the subcarriers. The process of estimating these channel parameters is referred to as channel estimation. After equalization, the received symbols are recovered using demodulation techniques.
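The chain described above can be sketched end to end with numpy. The parameters below (64 subcarriers, a 16-sample cyclic prefix, a 3-tap channel) are illustrative choices, not values mandated by any standard; the key point is that the cyclic prefix turns the channel's linear convolution into a circular one, so each subcarrier sees a single complex gain that a one-tap equalizer can invert.

```python
import numpy as np

rng = np.random.default_rng(0)
N, CP = 64, 16                        # subcarriers, cyclic-prefix length (illustrative)

# QPSK symbols on N subcarriers
bits = rng.integers(0, 2, (N, 2))
tx_syms = ((1 - 2 * bits[:, 0]) + 1j * (1 - 2 * bits[:, 1])) / np.sqrt(2)

# Transmitter: IFFT, then prepend the cyclic prefix
time_block = np.fft.ifft(tx_syms)
tx_signal = np.concatenate([time_block[-CP:], time_block])

# Frequency-selective channel: 3-tap multipath, shorter than the CP
h = np.array([0.8, 0.5, 0.3])
rx_signal = np.convolve(tx_signal, h)[: len(tx_signal)]

# Receiver: drop CP, FFT; the CP makes the convolution circular,
# so each subcarrier is simply scaled by the channel's response there
rx_block = np.fft.fft(rx_signal[CP:CP + N])
H = np.fft.fft(h, N)                  # per-subcarrier channel gains
rx_syms = rx_block / H                # one-tap (zero-forcing) equalization
```

In this noiseless sketch the channel is known exactly, so equalization recovers the transmitted symbols perfectly; in practice `H` must be estimated, which is the subject of the following paragraphs.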
In LTE downlink systems, the number of subcarriers in one OFDM symbol depends on the selected bandwidth (for example, 1.4 MHz, 5 MHz, 10 MHz, or 20 MHz). Fourteen (normal cyclic prefix) or twelve (extended cyclic prefix) such OFDM symbols constitute one sub-frame. The smallest time-frequency unit for downlink transmission is called a resource element (RE). Each RE carries one modulated symbol. To facilitate channel estimation, some of the resource elements are reserved for transmitting pilot symbols that are known at the User Equipment (UE), i.e. receiver, side. There are six different reference signal configurations for different transmission strategies, each with its own unique reference symbol positions across the resource grid. Of these, Cell Specific Reference Signals (CRS) are present in all the downlink sub-frames for frame structure type 1, that is, for a Frequency Division Duplex (FDD) system, and are scattered in a lattice fashion to cover the entire resource grid across time and frequency. Hence they are vital for estimating the channel.
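The CRS lattice for one antenna port can be sketched as a boolean mask over a small resource grid. This is a simplified illustration for antenna port 0 with the normal cyclic prefix, assuming a cell-specific frequency shift of zero; it is not a full implementation of the 3GPP mapping rules.

```python
import numpy as np

def crs_mask(n_sc=24, n_sym=14):
    """Simplified CRS lattice for antenna port 0, normal CP.

    Illustrative only: the cell-specific frequency shift (which depends
    on the physical cell ID) is assumed to be 0 here.
    """
    mask = np.zeros((n_sc, n_sym), dtype=bool)
    for l in (0, 4, 7, 11):               # CRS-bearing OFDM symbols in a sub-frame
        shift = 0 if l in (0, 7) else 3   # lattice staggered by 3 subcarriers
        mask[shift::6, l] = True          # one pilot every 6 subcarriers
    return mask

m = crs_mask()
```

The staggering across time and frequency is what lets the receiver sample the channel densely enough in both dimensions with relatively few pilot REs.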
In practice, wireless channels exhibit selectivity in the time as well as the frequency domain. This doubly-selective nature of the channel necessitates dynamic estimation of the channel at the receiver. Receiver systems adopt various techniques to estimate the channel using reference signals whose values at their REs are already known. The optimal channel estimator for such an arrangement is based on 2-D minimum mean square error (MMSE) interpolation. However, existing receiver systems implement 1-D estimators owing to the complexity of the 2-D estimator. Usually, the channel is estimated at the reference positions using least squares (LS) or MMSE techniques. The channel estimated at the reference positions is then interpolated across the time and frequency axes to obtain an estimate for the non-reference positions. Interpolation can be linear or MMSE based, with the latter being superior in performance. However, MMSE-based interpolation requires knowledge of the channel statistics, which is not feasible in practice due to the rapid changes in the wireless environment.
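The first stage, LS estimation at the pilot REs followed by linear interpolation across frequency, can be sketched as follows. The pilot spacing, channel length, and noise level are illustrative assumptions; linear interpolation is applied to the real and imaginary parts separately since `np.interp` operates on real data.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 48                                 # subcarriers (illustrative)
pilot_idx = np.arange(0, N, 6)         # one pilot every 6 subcarriers

# True channel: frequency response of a random 4-tap multipath channel
h = (rng.standard_normal(4) + 1j * rng.standard_normal(4)) / np.sqrt(8)
H_true = np.fft.fft(h, N)

# Known unit-magnitude pilot symbols and their noisy received versions
pilots = np.exp(1j * np.pi / 4 * rng.integers(0, 8, pilot_idx.size))
noise = 0.01 * (rng.standard_normal(pilot_idx.size)
                + 1j * rng.standard_normal(pilot_idx.size))
rx_pilots = H_true[pilot_idx] * pilots + noise

# LS estimate at each pilot: divide received symbol by the known pilot
H_ls = rx_pilots / pilots

# Linear interpolation across frequency to the non-pilot subcarriers
k = np.arange(N)
H_hat = np.interp(k, pilot_idx, H_ls.real) + 1j * np.interp(k, pilot_idx, H_ls.imag)
```

Linear interpolation needs no channel statistics, which is why it is popular in practice; its accuracy degrades as the channel becomes more frequency selective between pilot positions.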
Existing receiver systems provide a theoretical method of implementing MMSE-based interpolation in the frequency domain by calculating the auto-covariance matrix of the channel in the frequency domain. In practice, however, the receiver does not have knowledge of this auto-covariance matrix. An inaccurate auto-covariance matrix, when used for interpolation, degrades reception performance, and the degradation is more severe in the case of a highly frequency-selective channel. Given the dynamic nature of the channel, the receiver system needs to recalculate the auto-covariance matrix at regular time intervals.
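To make the role of the auto-covariance matrix concrete, the sketch below forms the frequency-domain MMSE interpolation weights from an assumed exponential correlation model. Both the correlation coefficient and the noise variance are hypothetical inputs: in a real receiver these statistics are unknown and must themselves be estimated, which is exactly the difficulty the paragraph above describes.

```python
import numpy as np

N = 48
pilot_idx = np.arange(0, N, 6)
sigma2 = 0.01                          # assumed noise variance at the LS estimates

# Assumed frequency-domain auto-covariance: exponential correlation model
# (a stand-in for the true, unknown channel statistics)
k = np.arange(N)
R = 0.9 ** np.abs(k[:, None] - k[None, :])

R_hp = R[:, pilot_idx]                       # cross-covariance: all REs vs pilot REs
R_pp = R[np.ix_(pilot_idx, pilot_idx)]       # auto-covariance among pilot REs

# MMSE interpolation weights: W = R_hp (R_pp + sigma2 I)^-1
W = R_hp @ np.linalg.inv(R_pp + sigma2 * np.eye(pilot_idx.size))

# Applying W to the vector of LS pilot estimates yields the MMSE
# channel estimate on all N subcarriers:  H_mmse = W @ H_ls
```

If the assumed `R` deviates from the true channel statistics, the weights `W` are mismatched and the interpolated estimate degrades, most severely for highly frequency-selective channels, which is why the matrix must be tracked over time.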