An ever-increasing number of relatively inexpensive, low-power wireless data communication services, networks, and devices have become available in recent years, promising near-wire-speed transmission and reliability. Various wireless technologies are described in detail in the IEEE 802 family of standards, including, for example, the IEEE Standard 802.11a (1999) and its updates and amendments, the IEEE Standard 802.11g (2003), and the IEEE Standard 802.11n now in the process of being adopted, all of which are collectively incorporated herein fully by reference. These standards have been or are being commercialized with the promise of 54 Mbps or higher data rates, making them strong competitors to traditional wired Ethernet and the more common “802.11b” or “WiFi” 11 Mbps mobile wireless transmission standard.
Generally speaking, transmission systems compliant with the IEEE 802.11a and 802.11g or “802.11a/g” standards, as well as the 802.11n standard, achieve their high data transmission rates using Orthogonal Frequency Division Multiplexing (OFDM) encoded symbols mapped onto a multi-carrier constellation using up to 64-point quadrature amplitude modulation (QAM). Generally speaking, the use of OFDM divides the overall system bandwidth into a number of frequency sub-bands or channels, with each frequency sub-band being associated with a respective subcarrier, or carrier frequency. Data on each subcarrier may be modulated with a modulation scheme such as quadrature amplitude modulation, phase shift keying, etc. Thus, each frequency sub-band of the OFDM system may be viewed as an independent transmission channel within which to send data, thereby increasing the overall throughput or transmission rate of the communication system.
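As an illustration of the per-subcarrier modulation described above, the following sketch maps independent groups of bits onto a 16-QAM constellation point for each of 64 subcarriers (the standards allow up to 64-QAM). All parameter values, names, and the Gray-coded level ordering are illustrative, not taken from any particular standard.

```python
import numpy as np

# Illustrative parameters: 64 subcarriers, 16-QAM (4 bits) on each.
NUM_SUBCARRIERS = 64
BITS_PER_SYMBOL = 4

def qam16_map(bits):
    """Map a group of 4 bits onto a 16-QAM point (Gray-coded per axis)."""
    # Two bits select the I level, two bits select the Q level.
    # Gray order along each axis: 00 -> -3, 01 -> -1, 11 -> +1, 10 -> +3.
    levels = np.array([-3, -1, 3, 1])
    i = levels[bits[0] * 2 + bits[1]]
    q = levels[bits[2] * 2 + bits[3]]
    return (i + 1j * q) / np.sqrt(10)  # normalize to unit average energy

# One OFDM symbol: each subcarrier independently carries its own 4 bits.
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, NUM_SUBCARRIERS * BITS_PER_SYMBOL)
symbols = np.array([qam16_map(bits[4 * k:4 * k + 4])
                    for k in range(NUM_SUBCARRIERS)])
```

Because each subcarrier is mapped independently, each sub-band behaves as its own transmission channel, which is the property the paragraph above describes.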
Generally, transmitters used in the wireless communication systems that are compliant with the aforementioned 802.11a/802.11g/802.11n standards, as well as other standards such as the 802.16 IEEE Standard, perform multi-carrier OFDM symbol encoding (which may include error correction encoding and interleaving), convert the encoded symbols into the time domain using Inverse Fast Fourier Transform (IFFT) techniques, and perform digital-to-analog conversion and conventional radio frequency (RF) upconversion on the signals. After appropriate power amplification, these transmitters transmit the modulated and upconverted signals to one or more receivers; the resulting transmission is a relatively high-speed time domain signal with a large peak-to-average ratio (PAR).
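The IFFT step and the resulting large peak-to-average ratio can be sketched as follows, assuming 64 subcarriers carrying random QPSK symbols; the parameter choices are illustrative rather than drawn from any standard.

```python
import numpy as np

# One OFDM symbol of random QPSK subcarrier values, each with unit energy.
rng = np.random.default_rng(1)
N = 64
freq_symbols = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)

# The inverse FFT converts the frequency-domain symbols into one
# time-domain OFDM symbol; norm="ortho" keeps the total energy unchanged.
time_signal = np.fft.ifft(freq_symbols, norm="ortho")

# Because many independent subcarriers can add in phase at some sample,
# the time-domain signal exhibits a large peak-to-average ratio (PAR).
power = np.abs(time_signal) ** 2
par_db = 10 * np.log10(power.max() / power.mean())
```

For random data, this ratio typically comes out several dB above unity, which is why OFDM power amplifiers must accommodate a large PAR.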
Likewise, the receivers used in the wireless communication systems that are compliant with the aforementioned 802.11a/802.11g/802.11n and 802.16 IEEE standards generally include an RF receiving unit that performs RF downconversion and filtering of the received signals (which may be performed in one or more stages), and a baseband processor unit that processes the OFDM encoded symbols bearing the data of interest. Generally, the digital form of each OFDM symbol presented in the frequency domain is recovered after baseband downconversion, conventional analog-to-digital conversion and Fast Fourier Transformation of the received time domain analog signal. Thereafter, the baseband processor performs frequency domain equalization (FEQ) and demodulation to recover the transmitted symbols. The recovered stream of symbols is then decoded, which may include deinterleaving and error correction using any of a number of known error correction techniques, to produce a set of recovered signals corresponding to the original signals transmitted by the transmitter.
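The FFT and frequency-domain equalization steps can be sketched as below. For simplicity, the channel is modeled as an assumed-known, independent complex gain per subcarrier, and the noiseless round trip is illustrative only.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 64
# Transmitted QPSK symbols, one per subcarrier, unit energy each.
tx = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], N) / np.sqrt(2)

# Illustrative channel: a complex gain (attenuation and phase rotation)
# on each subcarrier, here assumed perfectly known to the receiver.
channel = (0.5 + rng.random(N)) * np.exp(2j * np.pi * rng.random(N))

# Transmitter IFFT -> per-subcarrier channel -> receiver FFT.
time_tx = np.fft.ifft(tx, norm="ortho")
rx = np.fft.fft(time_tx, norm="ortho") * channel

# One-tap FEQ: dividing each subcarrier by its channel gain undoes the
# per-subcarrier attenuation and phase rotation.
equalized = rx / channel
```

The equalized symbols can then be demodulated and decoded (deinterleaving and error correction are omitted from this sketch).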
In wireless communication systems, the RF modulated signals generated by the transmitter may reach a particular receiver via a number of different propagation paths, the characteristics of which typically change over time due to the phenomena of multi-path and fading. Moreover, the characteristics of a propagation channel differ or vary based on the frequency of propagation. To compensate for the time varying, frequency selective nature of the propagation effects, and generally to enhance effective encoding and modulation in a wireless communication system, each receiver of the wireless communication system may periodically develop or collect channel state information (CSI) for each of the frequency channels, such as the channels associated with each of the OFDM sub-bands discussed above. Generally speaking, CSI is information defining or describing one or more characteristics about each of the OFDM channels (for example, the gain, the phase and the signal-to-noise ratio (SNR) of each channel). Upon determining the CSI for one or more channels, the receiver may send this CSI back to the transmitter, which may use the CSI for each channel to precondition the signals transmitted using that channel so as to compensate for the varying propagation effects of each of the channels.
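One simple way a receiver can collect the per-channel CSI described above is to compare a known pilot symbol against what was actually received. The sketch below estimates gain, phase, and SNR per subcarrier; the all-ones pilot, the channel model, and the assumption that the noise variance is known are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 64
pilot = np.ones(N)  # known pilot value on every subcarrier (illustrative)

# Illustrative channel and additive complex Gaussian noise.
channel = (0.5 + rng.random(N)) * np.exp(2j * np.pi * rng.random(N))
noise_var = 0.01
noise = np.sqrt(noise_var / 2) * (rng.standard_normal(N)
                                  + 1j * rng.standard_normal(N))
received = channel * pilot + noise

# Per-subcarrier channel estimate and the CSI quantities derived from it.
h_est = received / pilot
gain = np.abs(h_est)                           # channel gain
phase = np.angle(h_est)                        # channel phase
snr_db = 10 * np.log10(gain ** 2 / noise_var)  # per-subcarrier SNR
```

These per-subcarrier estimates are the kind of CSI a receiver could feed back so the transmitter can precondition each channel.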
To further increase the number of signals which may be propagated in the communication system and/or to compensate for deleterious effects associated with the various propagation paths, and to thereby improve transmission performance, it is known to use multiple transmit and receive antennas within a wireless transmission system. Such a system is commonly referred to as a multiple-input, multiple-output (MIMO) wireless transmission system and is specifically provided for within the 802.11n IEEE Standard now being adopted. Further, the 802.16 standard, or WiMAX, applies to cell-based systems and supports MIMO techniques. Generally speaking, the use of MIMO technology produces significant increases in spectral efficiency and link reliability of IEEE 802.11, IEEE 802.16, and other systems, and these benefits generally increase as the number of transmission and receive antennas within the MIMO system increases.
In addition to the frequency channels created by the use of OFDM, a MIMO channel formed by the various transmit and receive antennas between a particular transmitter and a particular receiver includes a number of independent spatial channels. As is known, a wireless MIMO communication system can provide improved performance (e.g., increased transmission capacity) by utilizing the additional dimensionalities created by these spatial channels for the transmission of additional data. Of course, the spatial channels of a wideband MIMO system may experience different channel conditions (e.g., different fading and multi-path effects) across the overall system bandwidth and may therefore achieve different SNRs at different frequencies (i.e., at the different OFDM frequency sub-bands) of the overall system bandwidth. Consequently, the number of information bits per modulation symbol (i.e., the data rate) that may be transmitted using the different frequency sub-bands of each spatial channel for a particular level of performance may differ from frequency sub-band to frequency sub-band.
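The idea that sub-bands with different SNRs carry different numbers of information bits per modulation symbol can be illustrated with a simple bit-loading rule. The SNR figures and the "SNR gap" value below are illustrative, and the floor-of-log formula is one common textbook rule rather than a rule mandated by any of the standards discussed here.

```python
import numpy as np

# Example per-sub-band SNRs (dB) across one spatial channel.
snr_db = np.array([5.0, 12.0, 20.0, 27.0])
gap_db = 6.0  # illustrative SNR gap accounting for a target error rate

snr = 10 ** (snr_db / 10)
gap = 10 ** (gap_db / 10)

# Each sub-band carries as many whole bits per symbol as its SNR supports.
bits_per_symbol = np.floor(np.log2(1 + snr / gap)).astype(int)
```

Under these example numbers, the weakest sub-band carries no data while the strongest carries six bits per symbol, matching the observation that the achievable data rate differs from sub-band to sub-band.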
However, instead of using the various transmission and receive antennas to form separate spatial channels on which additional information is sent, better transmission and reception properties can be obtained in a MIMO system by using each of the various transmission antennas to transmit the same signal, while phasing (and amplifying) this signal as it is provided to the various transmission antennas, to achieve beamforming or beamsteering. Generally speaking, beamforming or beamsteering creates a spatial gain pattern having one or more high gain lobes or beams (as compared to the gain obtained by an omni-directional antenna) in one or more particular directions, while reducing the gain in other directions below that obtained by an omni-directional antenna. If the gain pattern is configured to produce a high gain lobe in the direction of each of the receiver antennas, the MIMO system can obtain better transmission reliability between a particular transmitter and a particular receiver than that obtained by single transmitter-antenna/receiver-antenna systems.
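The phasing described above can be sketched for a narrowband uniform linear array: each antenna transmits the same signal with a phase offset chosen so the contributions add coherently in a target direction. The array geometry, spacing, and steering angle below are illustrative assumptions.

```python
import numpy as np

num_antennas = 4
d = 0.5             # antenna spacing in wavelengths (illustrative)
target_deg = 30.0   # direction to steer the high-gain lobe toward

def steering_vector(angle_deg):
    """Relative phases seen across the array for a plane wave at angle_deg."""
    angle = np.deg2rad(angle_deg)
    n = np.arange(num_antennas)
    return np.exp(2j * np.pi * d * n * np.sin(angle))

# Beamforming weights: the conjugate of the steering vector toward the
# target, normalized so total transmit power is unchanged.
weights = steering_vector(target_deg).conj() / np.sqrt(num_antennas)

def array_gain(angle_deg):
    """Power gain of the weighted array in a given direction."""
    return np.abs(weights @ steering_vector(angle_deg)) ** 2
```

In the steered direction the contributions add coherently, giving a power gain equal to the number of antennas (6 dB here), while other directions receive less gain, producing the lobed pattern described above.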
Various challenges may arise in wireless communication because of signal distortions or interruptions during transmission. One such challenge is synchronizing transmitted information. More specifically, when a frame (or another data unit) is transmitted from a transmitting device to a receiving device, the receiving device may need to establish and/or compensate for the frame timing (FT), e.g., the boundaries of the data unit, or, in other words, where the data unit starts and/or ends. Further, if the transmitting device is transmitting data via multiple carrier frequencies (e.g., using the OFDM scheme), the receiving device may need to establish and/or compensate for a carrier frequency offset (CFO), e.g., due to a difference between the carrier frequency of the transmitted signal and a frequency of the receiver's local oscillator.
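At baseband, a carrier frequency offset manifests as a steadily advancing phase rotation on the received samples, which is what makes it both damaging and measurable. The sketch below uses an illustrative sample rate and offset, and a constant stand-in for the transmitted samples so the rotation is easy to see.

```python
import numpy as np

fs = 20e6        # sample rate in Hz (illustrative)
delta_f = 100e3  # carrier frequency offset in Hz (illustrative)

n = np.arange(128)
clean = np.ones(128, dtype=complex)  # stand-in for transmitted samples

# A CFO of delta_f rotates each successive sample by an extra
# 2*pi*delta_f/fs radians relative to the previous one.
received = clean * np.exp(2j * np.pi * delta_f * n / fs)

# The phase advance between consecutive samples reveals the offset.
phase_step = np.angle(received[1] * received[0].conj())
estimated_cfo = phase_step * fs / (2 * np.pi)
```

Real receivers estimate this phase advance from many sample pairs (and from known signal structure such as a preamble, as discussed below) to average out noise, but the underlying relationship is the one shown here.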
Various synchronization techniques have been developed to address this challenge. Some of the most common techniques place a preamble symbol structure at the start of the frame to mark the start of the frame and, more generally, to provide a reference for frame synchronization. The preamble symbol structure establishes fixed relationships among subcarriers of the transmitted signal. The fixed relationships established through the use of the preamble symbol structure allow the devices, mobile stations, and the network to use or continue to transmit the information contained in the signal by minimizing the effects of distortions and interruptions. This can be achieved, for example, by using the preamble symbol structure to adjust a timing offset, a frequency offset, or transmitted signal power.
The preamble symbol can be arranged in a variety of ways to achieve a usable reference for frame synchronization. In some common synchronization techniques, often referred to as cross-correlation based synchronization techniques, a predefined preamble signal is used that differs from other signals in a detectable way. The detectable difference may be created, for example, by power boosting the preamble symbols to a higher dB value than the other symbols, by using different modulation techniques for the preamble signal, by allocating the subcarriers of the preamble signal in a particular pattern, and so on. Thus, when a receiving device receives a signal, the received signal may then be cross-correlated with the predefined preamble signal to detect the presence, start, etc., of the predefined preamble signal.
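Cross-correlation based detection can be sketched as a sliding dot product between the received samples and the locally stored copy of the predefined preamble; the correlation peaks where the preamble begins. The preamble sequence, frame layout, and noise level below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

# Known preamble, followed by payload, preceded by low-power noise.
preamble = rng.choice([-1.0, 1.0], 32)   # predefined preamble (illustrative)
payload = rng.choice([-1.0, 1.0], 96)
start = 40                               # true preamble start index
frame = np.concatenate([0.1 * rng.standard_normal(start), preamble, payload])

# Sliding cross-correlation of the received samples with the known preamble.
corr = np.array([np.dot(frame[i:i + 32], preamble)
                 for i in range(len(frame) - 32 + 1)])

# The correlation magnitude peaks at the preamble's starting sample.
detected_start = int(np.argmax(np.abs(corr)))
```

Note that this approach works only because the receiver stores the predefined preamble, which is precisely the requirement criticized in the following paragraph.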
One drawback of cross-correlation based synchronization techniques is that they require the receiving device to know the predetermined preamble signal, and possibly numerous predetermined preamble signals associated with different transmitting devices. This may lead to increased computational complexity, inefficient use of resources, higher costs, etc. Therefore, some synchronization techniques, often referred to as auto-correlation based synchronization techniques, employ preamble signals that are repetitive in the time domain. Exploiting the repetition property of the preamble signal, the receiver may determine the FT based on the auto-correlation of the received signal, without exact knowledge of the preamble signal. However, conventional auto-correlation based synchronization techniques are suboptimal.
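Auto-correlation based timing can be sketched with a preamble made of two identical halves: the received signal correlates strongly with a delayed copy of itself at a lag of half the preamble length, wherever the repeated structure sits, with no knowledge of the preamble contents needed. The metric below follows the general Schmidl-Cox idea; all lengths, power levels, and names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
L = 32  # half-length of the preamble (illustrative)

# Preamble that is repetitive in the time domain: two identical halves.
half = rng.standard_normal(L) + 1j * rng.standard_normal(L)
preamble = np.concatenate([half, half])

start = 50  # true preamble start index
frame = np.concatenate([
    0.1 * (rng.standard_normal(start) + 1j * rng.standard_normal(start)),
    preamble,
    rng.standard_normal(100) + 1j * rng.standard_normal(100),
])

def timing_metric(r, d):
    """Normalized correlation between the two half-windows starting at d."""
    p = np.sum(r[d:d + L].conj() * r[d + L:d + 2 * L])
    energy = np.sum(np.abs(r[d + L:d + 2 * L]) ** 2)
    return np.abs(p) ** 2 / energy ** 2

# The metric approaches 1 only where the two windows cover the identical
# preamble halves, giving the frame timing without knowing the preamble.
metrics = np.array([timing_metric(frame, d) for d in range(len(frame) - 2 * L)])
estimated_start = int(np.argmax(metrics))
```

The angle of the correlation sum `p` at the detected timing also yields a CFO estimate in practice, although that step is omitted from this sketch.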