The present invention generally relates to wireless communication receivers, and particularly relates to signal demodulation in wireless communication receivers.
Communication over wireless networks occurs by modulating information-bearing waveforms onto carriers, e.g., sinusoidal carriers. Modulated signals are transmitted through wireless communication channels to one or more receivers, subjecting the signals to noise and interference. Wireless communication channels may be modeled as linear time-varying systems. Using a time-varying linear model of the channel, the relationship between a transmitted signal x(t) and a received signal r(t) is given by:

r(t)=∫g(t,τ)x(t−τ)dτ+z(t)  (1)

where z(t) is an Additive White Gaussian Noise (AWGN) process and g(t, τ) is the time-varying instantaneous channel impulse response, where time variations of the channel are represented by the variable t and the time dispersiveness, or delay spread, of the channel is represented by the variable τ. An accurate model or estimate of the channel impulse response g(t, τ) is used to properly reconstruct transmitted signals, e.g., to properly restore the amplitude and phase information of transmitted symbols, thus enabling coherent signal demodulation.
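The channel model of equation (1) can be illustrated numerically in discrete time, where the integral becomes a sum over delay taps. The following is a minimal sketch only; the two-tap channel, the BPSK samples, and all numerical values are hypothetical choices for illustration, not part of the described invention.

```python
import numpy as np

# Discrete-time sketch of equation (1): r[n] = sum_k g[n, k] x[n - k] + z[n].
# The two-tap time-varying channel below is a hypothetical example.
rng = np.random.default_rng(0)

N = 64                                 # number of samples (illustrative)
x = rng.choice([-1.0, 1.0], size=N)    # transmitted BPSK samples

# Time-varying impulse response g[n, k]: tap value at sample n, delay k.
# Tap 0 fades slowly over time; tap 1 is a weaker delayed echo.
g = np.zeros((N, 2))
g[:, 0] = 1.0 + 0.1 * np.cos(2 * np.pi * np.arange(N) / N)
g[:, 1] = 0.3

z = 0.01 * rng.standard_normal(N)      # additive white Gaussian noise z[n]

# Received signal per the discrete form of equation (1)
r = np.zeros(N)
for n in range(N):
    for k in range(2):
        if n - k >= 0:
            r[n] += g[n, k] * x[n - k]
    r[n] += z[n]
```

Because the channel taps depend on the sample index n, the same transmitted symbol is distorted differently at different times, which is the time-varying behavior the estimators discussed below must track.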
Some wireless communication systems employ time and frequency selective modulation techniques. In such systems, both the frequency selectivity and the time selectivity of a wireless communication channel are estimated to ensure proper demodulation of signals transmitted over the channel, where frequency selectivity is a measure of channel response variation with respect to the frequency of a transmitted signal and time selectivity is a measure of channel response variation with respect to the movement of a transmitter and/or receiver. In single-carrier transmission networks such as Code Division Multiple Access (CDMA) based networks, the frequency selectivity of the channel is conventionally estimated by correlating a received signal with a known pilot signal, and time selectivity is conventionally measured by observing multiple pilot signals inserted periodically over time.
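The pilot-correlation approach can be sketched as follows, assuming a hypothetical known pilot chip sequence and a single-path complex channel gain; the sequence length, gain value, and noise level are illustrative only.

```python
import numpy as np

# Sketch of pilot-correlation channel estimation: despread the received
# signal with the known pilot to recover the complex channel gain.
rng = np.random.default_rng(1)

L = 128                                    # pilot length (illustrative)
pilot = rng.choice([-1.0, 1.0], size=L)    # known pilot chips

h = 0.8 * np.exp(1j * 0.5)                 # true channel gain (unknown to receiver)
noise = 0.05 * (rng.standard_normal(L) + 1j * rng.standard_normal(L))
received = h * pilot + noise

# Correlating with the known pilot and normalizing by L averages the
# noise down and recovers an estimate of h.
h_est = np.dot(received, pilot) / L
```

The longer the pilot, the more the correlation suppresses noise, which is why the estimate improves roughly with the square root of the pilot length.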
In Orthogonal Frequency Division Multiplexing (OFDM) based networks, data symbols are modulated onto orthogonal time-frequency units defined by the frequency subcarriers of an OFDM symbol. As such, a wireless communication channel in an OFDM network is conventionally described by a time-varying frequency response H(t,f) given by:

H(t,f)=∫g(t,τ)e^(−j2πfτ)dτ  (2)
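For a discrete channel with taps at integer sample delays, sampling H(t,f) of equation (2) at the subcarrier frequencies reduces to a DFT of the delay taps. The sketch below uses hypothetical tap values and subcarrier count purely for illustration.

```python
import numpy as np

# Sketch of equation (2): at a fixed time t, the frequency response is the
# Fourier transform of the impulse response over the delay variable tau.
g_taps = np.array([1.0, 0.5, 0.25])   # g(t, tau) at delays tau = 0, 1, 2 samples

N_sc = 8                              # number of OFDM subcarriers (illustrative)
# H at subcarrier m: sum_tau g(t, tau) exp(-j 2 pi m tau / N_sc),
# i.e., an N_sc-point DFT of the (zero-padded) delay taps.
H = np.fft.fft(g_taps, n=N_sc)

# Cross-check one subcarrier against the defining sum
m = 3
H_direct = sum(g_taps[tau] * np.exp(-2j * np.pi * m * tau / N_sc)
               for tau in range(len(g_taps)))
```

This equivalence is why, in OFDM, each subcarrier sees (approximately) a single complex gain H(t, f_m) rather than a dispersive channel.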
Channel estimation in OFDM networks is conventionally accomplished by replacing data symbols with known pilot symbols across time, frequency, or both, such that the time-varying impulse response of the channel may be interpolated from the known pilot symbols. One conventional approach, referred to as block-type pilot channel estimation, is based on a slow fading channel model and is performed by inserting known pilot symbols into all subcarriers of OFDM symbols within a specified symbol period, e.g., in every Nth block of OFDM symbols. Some conventional block-type pilot channel estimators are Least-Square (LS) estimators, Minimum Mean-Square Error (MMSE) estimators, and modified MMSE estimators. Block-type pilot channel estimation models are not suitable for fast fading channels, where the channel response may vary between OFDM symbol blocks. Channel estimation in fast fading OFDM channels is conventionally done using a comb-type pilot estimation technique, where known pilot symbols are inserted into a subset of the OFDM subcarriers of each OFDM symbol block. Some conventional comb-type pilot channel estimators are LS estimators, Maximum Likelihood (ML) estimators, and parametric channel modeling-based estimators.
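Comb-type pilot LS estimation with interpolation can be sketched as follows. The pilot spacing (every 4th subcarrier), the smoothly varying channel, and the noise level are hypothetical choices for illustration; linear interpolation stands in for the many interpolation schemes used in practice.

```python
import numpy as np

# Sketch of comb-type pilot LS channel estimation over one OFDM symbol,
# assuming pilots on every 4th subcarrier (hypothetical layout).
rng = np.random.default_rng(2)

N_sc = 32
H_true = np.exp(-1j * 2 * np.pi * np.arange(N_sc) / N_sc)  # smooth channel
X = rng.choice([-1.0, 1.0], size=N_sc).astype(complex)     # transmitted symbols
X[::4] = 1.0 + 0j                                          # known pilot symbols
Y = H_true * X + 0.01 * (rng.standard_normal(N_sc)
                         + 1j * rng.standard_normal(N_sc))

# LS estimate at pilot positions: H_p = Y_p / X_p
pilot_idx = np.arange(0, N_sc, 4)
H_pilot = Y[pilot_idx] / X[pilot_idx]

# Interpolate real and imaginary parts across all subcarriers
all_idx = np.arange(N_sc)
H_est = (np.interp(all_idx, pilot_idx, H_pilot.real)
         + 1j * np.interp(all_idx, pilot_idx, H_pilot.imag))
```

Note that np.interp clamps beyond the last pilot, so subcarriers past the final pilot are estimated poorly here; practical systems place pilots at or near the band edges for this reason.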
Conventional channel estimators, such as those described above, model the instantaneous impulse response of a wireless communication channel to a pilot signal and use the modeled response to perform received signal demodulation in accordance with equation (1). However, the time selectivity of a wireless communication channel in a land-based mobile communication environment originates mainly from the movement of transmit and receive terminals within the environment. As a result, channel time selectivity arises mainly from Doppler shift, i.e., the frequency shift caused by the change in distance between a transmitter and receiver over time, which manifests itself as a change in transmission delay, phase, and path loss with time.
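The magnitude of the Doppler shift, and hence the rate of channel time variation, follows from the relative speed and the carrier frequency. The sketch below uses illustrative (hypothetical) values, and the coherence-time figure is only the common order-of-magnitude rule of thumb.

```python
# Sketch of the Doppler shift that drives channel time selectivity.
c = 3.0e8        # speed of light, m/s
f_c = 2.0e9      # carrier frequency, Hz (illustrative)
v = 30.0         # relative transmitter/receiver speed, m/s (~108 km/h)

# Maximum Doppler shift: f_d = v * f_c / c
f_d = v * f_c / c          # -> 200 Hz for these values

# Rule of thumb: channel coherence time is on the order of 1 / f_d,
# i.e., how long the channel response stays roughly constant.
T_c = 1.0 / f_d            # seconds
```

At these illustrative values the channel decorrelates within a few milliseconds, which is why estimators built on a slow fading assumption break down for fast-moving terminals.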