Wireless communication systems suffer from multipath fading, especially at data rates beyond those of Long-Term Evolution (LTE)-Advanced standards. In any terrestrial radio communication system, the signal will travel directly to a receiver (i.e., a direct path) and/or via relays and reflections off buildings, hills, ground, water, and other objects (i.e., indirect paths). Selective fading occurs when the multipath fading affects different frequencies across the channel to varying degrees. As such, the phases and amplitudes of the channel frequency response will vary over the signal bandwidth. Sometimes relatively deep nulls may be experienced, giving rise to degraded signal reception. Simply maintaining the overall average amplitude of the received signal will not overcome the effects of selective fading, and some form of equalization may be needed.
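The frequency-selective nulls described above can be illustrated with a minimal sketch: a hypothetical two-path channel (direct path plus one delayed reflection, with illustrative gain and delay values not taken from the text) whose frequency response exhibits deep nulls wherever the reflected path arrives in anti-phase with the direct path.

```python
import numpy as np

def channel_frequency_response(gain=0.9, delay=4, n_bins=64):
    """Frequency response of a two-path channel (illustrative values).

    Impulse response: h[n] = delta[n] + gain * delta[n - delay].
    """
    h = np.zeros(n_bins)
    h[0] = 1.0
    h[delay] = gain
    # DFT gives the response at n_bins uniformly spaced frequencies
    return np.fft.fft(h)

H = channel_frequency_response()
mag = np.abs(H)

# The amplitude varies over the band: constructive addition where the
# paths align in phase, near-cancellation (a deep null) where they do not.
print(f"max |H| = {mag.max():.2f}, min |H| = {mag.min():.2f}")
```

For these illustrative values the response swings between |1 + 0.9| and |1 - 0.9|, i.e., a subcarrier landing on a null is attenuated roughly 25 dB relative to one at a peak, which is why flat gain control alone cannot repair selective fading.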
To combat multipath fading effects, orthogonal frequency division multiplexing (OFDM) techniques are used in existing 4G LTE and IEEE 802.11 Wi-Fi wireless communication systems. OFDM techniques spread the data over a wideband channel consisting of a large number of narrowband subcarriers. When only a portion of the data is lost to nulls on a few narrowband subcarriers, the lost data can be reconstituted using forward error correction techniques, thus mitigating the effects of selective multipath fading. Code Division Multiple Access (CDMA) schemes such as Direct Sequence Code Division Multiple Access (DS-CDMA) are also used to combat multipath fading but have not been used significantly for relay network communication systems.
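The OFDM mechanism above can be sketched in a few lines, assuming illustrative parameters (64 subcarriers, QPSK, a short two-path channel): the IFFT maps symbols onto narrowband subcarriers, a cyclic prefix absorbs the delay spread, and after the FFT the multipath channel reduces to a single complex gain per subcarrier that a one-tap equalizer inverts.

```python
import numpy as np

rng = np.random.default_rng(0)
N, CP = 64, 8                       # subcarriers, cyclic-prefix length
h = np.array([1.0, 0, 0, 0, 0.9])   # illustrative two-path channel

# QPSK symbol on each narrowband subcarrier
bits = rng.integers(0, 2, size=(N, 2))
syms = (1 - 2 * bits[:, 0]) + 1j * (1 - 2 * bits[:, 1])

# OFDM modulation: IFFT across subcarriers, then prepend cyclic prefix
x = np.fft.ifft(syms)
tx = np.concatenate([x[-CP:], x])

# Multipath channel: linear convolution (the CP absorbs the delay spread,
# making it equivalent to circular convolution over one symbol)
rx = np.convolve(tx, h)[CP:CP + N]

# Demodulation: the FFT diagonalizes the channel into one gain per subcarrier
Y = np.fft.fft(rx)
Hf = np.fft.fft(h, N)
est = Y / Hf                        # one-tap equalizer per subcarrier
```

In the noiseless case `est` recovers `syms` exactly; in practice, subcarriers sitting near a channel null carry low SNR, and forward error correction across subcarriers reconstitutes the data they lose.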
OFDM and DS-CDMA systems operating over multipath channels each have known drawbacks. For example, CDMA systems using rake receivers exhibit inferior Bit Error Rates (BER) compared to OFDM systems. On the other hand, OFDM systems completely fail in frequency-offset environments (e.g., under Doppler frequency shifts caused by relative mobile movement).
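The sensitivity of OFDM to frequency offset can be seen in a minimal sketch, assuming a single active subcarrier and an illustrative carrier frequency offset of a quarter of the subcarrier spacing: the offset shifts the tone off the FFT grid, so its energy leaks into every other subcarrier (inter-carrier interference) instead of landing cleanly on one bin.

```python
import numpy as np

N = 64
n = np.arange(N)
k0, eps = 10, 0.25   # active subcarrier; offset as fraction of spacing

x = np.exp(2j * np.pi * k0 * n / N)        # transmitted subcarrier
r = x * np.exp(2j * np.pi * eps * n / N)   # received with Doppler offset
S = np.fft.fft(r) / N                      # normalized demodulator output

# With eps = 0 all energy lands on bin k0; with a nonzero offset,
# orthogonality is lost and energy leaks across the other bins.
on_bin = abs(S[k0]) ** 2
leaked = np.sum(np.abs(S) ** 2) - on_bin
print(f"energy on k0: {on_bin:.3f}, leaked to other bins: {leaked:.3f}")
```

Even this modest offset diverts a substantial fraction of the subcarrier's energy into interference on neighboring bins, which is why OFDM requires accurate carrier synchronization while rake-based DS-CDMA receivers are comparatively tolerant of such offsets.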
To combat the deleterious effects of multiple-access interference (MAI), the conventional approach in the CDMA scheme has been to employ fixed orthogonal user sequences or signatures with low cross-correlation properties. However, the orthogonality or desired cross-correlations of the transmitted sequences are often destroyed when the sequences are received at the base station or the destination, due to multipath fading, inter-symbol interference, and MAI. Spread-spectrum relay channels with deterministic (fixed) or random spreading sequences are typically used. However, these and other strategies do not improve and secure the signals sufficiently for modern communication requirements. Another strategy is to obtain pseudo-noise (PN) sequences by maximizing the signal-to-interference-plus-noise ratio (SINR) via the maximum eigenvalue principle. However, this approach is not designed for relay systems and often does not converge.
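How multipath destroys the designed orthogonality can be shown with a minimal sketch, assuming length-8 Walsh-Hadamard signatures (a standard choice of fixed orthogonal codes, used here purely for illustration) and a one-chip multipath delay modeled as a zero-padded shift: the delayed echo of one user's code correlates nonzero against another user's code, producing MAI at the despreader.

```python
import numpy as np

def hadamard(n):
    """Walsh-Hadamard matrix of order n (n a power of two).

    Its rows are mutually orthogonal +/-1 spreading codes.
    """
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

H = hadamard(8)
c1, c2 = H[1], H[2]                 # signatures of two users
assert np.dot(c1, c2) == 0          # orthogonal as transmitted

# A one-chip multipath delay (echo) misaligns user 1's code at the
# receiver; zero-padding models chips spilling past the symbol boundary.
delayed_c1 = np.concatenate([[0.0], c1[:-1]])
mai = np.dot(delayed_c1, c2)        # residual cross-correlation
print("cross-correlation after 1-chip delay:", mai)
```

The nonzero partial-period cross-correlation is exactly the MAI term that fixed orthogonal signatures cannot prevent once the channel introduces delay spread, motivating the adaptive sequence-design strategies discussed above.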