The present invention generally relates to wireless communication networks, and particularly relates to noise estimation processing in wireless communication receivers.
Noise estimation represents an integral part of receiver processing in wireless communication networks. For example, many types of modern wireless communication networks use “best-effort” packet data channels, where individual users are served at the highest data rates that can be supported given the prevailing radio and network conditions. Accurate noise estimation at the receivers is essential to the signal quality calculations performed at those receivers and, in turn, those signal quality calculations drive the network's selection of serving data rates for the individual users.
If a wireless communication device operating on a best-effort channel reports an erroneously high received signal quality, the supporting network may select a serving data rate that is too high for reliable reception at the device. The built-in retransmission mechanisms commonly adopted for such best-effort channels, such as Hybrid Automatic Repeat Request (H-ARQ), exacerbate the problems associated with selecting a serving data rate that is too high for the reception conditions at the receiver, because the repeated retransmission of data packets erroneously received at the device lowers the effective data transmission rate. Indeed, with a high incidence of reception errors at the device, the effective data rate can be significantly lower than would be achieved by selecting a lower data rate more commensurate with the actual received signal quality at the device.
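The throughput penalty described above can be sketched with a simple model. This is an illustrative assumption, not a method from the present disclosure: it treats each block as failing independently with a given block error rate (BLER) under plain ARQ, so a block requires 1/(1 - BLER) transmissions on average and the effective rate is the nominal rate scaled by (1 - BLER). Real H-ARQ with soft combining performs better than this pessimistic bound, but the qualitative effect is the same.

```python
# Illustrative sketch (assumed model, not from the application): effective
# throughput under simple ARQ with an independent per-transmission block
# error rate (BLER). With unbounded retransmissions, the expected number of
# transmissions per block is 1/(1 - BLER), so the effective data rate is
# nominal_rate * (1 - BLER).

def effective_rate(nominal_rate_mbps: float, bler: float) -> float:
    """Effective rate when each block needs 1/(1 - BLER) transmissions on average."""
    return nominal_rate_mbps * (1.0 - bler)

# A too-aggressive serving rate with many reception errors...
aggressive = effective_rate(7.2, bler=0.7)    # 7.2 Mbps nominal -> 2.16 Mbps effective
# ...can deliver less than a modest rate matched to actual signal quality.
conservative = effective_rate(3.6, bler=0.1)  # 3.6 Mbps nominal -> 3.24 Mbps effective
```

The example illustrates why over-reporting signal quality is self-defeating: the higher nominal rate is more than offset by the retransmission overhead it induces.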
Conversely, if the device reports an erroneously low received signal quality, the network selects a lower data rate than actually could be supported, and the best-effort channel is underutilized with respect to that device. The underutilization can be severe, depending upon the particular data rate setting method adopted by the network.

In W-CDMA systems, mobile stations engaged in high-rate packet data services with the network, e.g., High Speed Downlink Packet Access (HSDPA) services, provide received signal quality feedback to the network in the form of transmitted Channel Quality Indicators (CQI).
Basically, the CQI reports from a given mobile station correspond to the signal-to-interference ratio (SIR) as measured by the mobile station for a reference channel signal transmitted from the network sector serving the mobile station. The CQI values reported by the mobile stations are “mapped” into a table of available data rates, and a mobile station that is under-reporting signal quality is thus allocated a lower data rate than its conditions can support.
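The SIR-to-CQI-to-rate mapping described above can be sketched as follows. The thresholds, CQI indices, and data rates in this table are invented for illustration only; they are not the 3GPP-specified values, and the function names are hypothetical.

```python
# Hypothetical sketch of the CQI reporting/mapping described above: the mobile
# station quantizes its measured reference-channel SIR to a CQI index, and the
# network maps the reported CQI into a table of available data rates.
# All numeric values below are illustrative placeholders, not 3GPP values.

CQI_TABLE = [  # (min SIR in dB, CQI index, data rate in kbps) -- illustrative
    (-2.0,  1,  137),
    ( 1.0,  5,  461),
    ( 4.0, 10, 1262),
    ( 8.0, 15, 3319),
    (12.0, 20, 7168),
]

def cqi_from_sir(sir_db: float) -> int:
    """Mobile side: report the highest CQI whose SIR threshold is met."""
    cqi = CQI_TABLE[0][1]
    for threshold, index, _rate in CQI_TABLE:
        if sir_db >= threshold:
            cqi = index
    return cqi

def rate_for_cqi(cqi: int) -> int:
    """Network side: look up the serving data rate for a reported CQI."""
    for _threshold, index, rate in CQI_TABLE:
        if index == cqi:
            return rate
    raise ValueError("unknown CQI index")
```

Under this model, a mobile station whose noise estimate is biased high measures a lower SIR, reports a lower CQI, and is therefore served at a lower rate than its actual conditions could support.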
Receiver frequency error represents a primary source of noise estimation errors. For example, accurate noise estimation at the receiver depends on accurately processing a received reference signal, e.g., received pilot symbols. Any error between the receiver's frequency and the (network) transmitter's frequency gives rise to symbol de-rotation errors, which, in turn, cause noise and channel estimation errors at the receiver. Ideally, then, a wireless communication receiver would directly compensate its noise estimation processing based on observed receiver frequency errors.
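The mechanism by which a frequency error inflates the noise estimate can be sketched numerically. Under the assumptions below (illustrative only, not the compensation method of this disclosure), a residual frequency error df places a phase ramp exp(j·2π·df·n·T) on the received pilot symbols; an estimator that averages the pilots to form a channel estimate and measures the spread about that average then counts the un-compensated rotation as noise, even when no actual noise is present.

```python
# Illustrative sketch (assumed model): spurious noise power reported by a
# simple pilot-based estimator when the only impairment is a residual
# frequency error between transmitter and receiver. The symbol period and
# pilot count are arbitrary example values.

import cmath

def noise_estimate(freq_error_hz: float, n_pilots: int = 64,
                   symbol_period_s: float = 1.0 / 15000.0) -> float:
    """Noise power estimated from noiseless unit-amplitude pilots under a frequency error."""
    # Received pilots: unit symbols rotated by the accumulating phase ramp.
    pilots = [cmath.exp(2j * cmath.pi * freq_error_hz * n * symbol_period_s)
              for n in range(n_pilots)]
    # Channel estimate formed by averaging the pilot observations.
    h_est = sum(pilots) / n_pilots
    # "Noise" measured as the mean squared deviation about the channel estimate;
    # with a frequency error, residual rotation appears here as noise power.
    return sum(abs(p - h_est) ** 2 for p in pilots) / n_pilots
```

With zero frequency error the pilots are identical and the estimated noise power is essentially zero, while a few hundred hertz of uncompensated error alone produces a clearly nonzero noise estimate, motivating the direct compensation suggested above.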