The following abbreviations are used in the present specification:
AWGN    Additive White Gaussian Noise
FCB     Frequency Correction Burst
FCCH    Frequency Correction Channel
FSC     First Signal Component
GMSK    Gaussian Minimum Shift Keying
GSM     Global System for Mobile Communications
LR      Likelihood Ratio
LRV     Likelihood Ratio Value
SNR     Signal-to-Noise Ratio
TDMA    Time Division Multiple Access
UE      User Equipment
In GSM (Global System for Mobile Communications) systems, before a wireless device can communicate with a base station, it must first synchronise its local oscillator with the local oscillator of the base station. To facilitate this synchronisation, base stations transmit regular signals known as Frequency Correction Bursts (FCBs).
GSM systems use TDMA (Time Division Multiple Access) channel access methods to divide a frequency channel (i.e. carrier) into a series of channels in the time domain, which can be used for different purposes. More specifically, a frequency channel is divided into a series of TDMA frames, which are in turn divided into a number of time slots. Each time slot is associated with a particular channel and contains data bits to be transmitted to one or more wireless receivers.
Frequency Correction Bursts (FCBs) are transmitted on the Frequency Correction Channel (FCCH), which is a shared downlink channel that can be monitored by any wireless device in the vicinity of a base station transmitting on that channel. In GSM systems, timeslot 0 of every 10th or 11th TDMA frame is usually allocated to the FCCH. Thus, depending on the GSM system, an FCB is transmitted by a base station either every 46.2 ms or every 50.8 ms. Each FCB comprises a series of 142 consecutive logical zeros as data bits.
GSM systems use GMSK (Gaussian Minimum Shift Keying) modulation to modulate data bits onto a carrier signal. A GMSK modulated signal can be represented as:

s(t) = A cos(2πft + γ(t))    Eqn. 1

where γ(t) is the signal phase, f is the carrier frequency, A is the signal amplitude, and t is the time. The signal phase γ(t) can be expressed as:

γ(t) = γ0 + Σk xk·Γ(t − k·Ts)    Eqn. 2

where Ts is the symbol period and xk denotes the kth transmitted symbol, which in general can take the values ±1. γ0 is the initial phase and Γ(t) is a function which, at a high level, comprises steps of π/2 per symbol, smoothed to achieve a narrow spectrum. It will be appreciated that the signal phase γ(t) is dependent on the transmitted symbols xk. For further details on GMSK modulation, a standard textbook on the GSM system can be consulted, e.g. "The GSM System for Mobile Communications" by Michel Mouly and Marie-Bernadette Pautet, published by Cell & Sys.
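The phase model of Eqn. 2 can be sketched in a few lines of code. The sketch below uses the unsmoothed (MSK) limit, in which each symbol contributes a linear phase ramp of ±π/2 over one symbol period; the Gaussian smoothing of a real GMSK modulator is deliberately omitted, and the function names and the choice of 8 samples per symbol are illustrative assumptions.

```python
import numpy as np

# Sketch of Eqn. 2 in the unsmoothed (MSK) limit: each symbol x_k
# contributes a phase step of x_k * pi/2, ramped linearly over one
# symbol period. A real GMSK modulator smooths this ramp with a
# Gaussian filter, which is omitted here for clarity.
def msk_phase(symbols, samples_per_symbol=8, gamma0=0.0):
    # Per-sample phase increment: +/- pi/2 spread over one symbol period.
    increments = (np.repeat(np.asarray(symbols, dtype=float), samples_per_symbol)
                  * (np.pi / 2) / samples_per_symbol)
    return gamma0 + np.cumsum(increments)

# An all-(+1) symbol stream, as produced by an FCB, yields a phase that
# ramps up linearly by pi/2 per symbol.
phase = msk_phase([1, 1, 1, 1])
```

With four +1 symbols the accumulated phase reaches 4 · π/2 = 2π, illustrating the constant rotation discussed below.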
As noted above, an FCB comprises a series of 142 consecutive logical zeros as data bits. In other words, the data bits of an FCB can be represented as the vector x̃ = [0, . . . , 0]^T. Each zero data bit is then mapped by the GMSK modulator to a symbol of +1. Thus, the transmitted symbols xk in an FCB can be represented as the vector x = [1, . . . , 1]^T.
As the transmitted symbols xk in an FCB are all +1, it follows from Eqn. 2 that the phase of the modulated signal increases constantly during an FCB. This phase change can be expressed as a constant rotation of φ per symbol. The discretised version of γ(t) can therefore be expressed as:

γk = γk−1 + φ    Eqn. 3

where, in GSM systems, φ = π/2, and γk−1 represents the phase of the signal for the previously transmitted symbol.
As will be appreciated, therefore, in GMSK modulation, as an FCB comprises a stream of consecutive logical zeros, a modulated FCB signal is a sinusoidal signal with a phase that changes at a constant rate of φ multiplied by the symbol rate. As the rate of change of the phase of the signal is constant, the frequency of the signal is also constant. Thus, a modulated FCB can be thought of as a sinusoidal signal with a constant frequency (or in other words, a pure tone signal). In the context of GSM, the frequency of the modulated signal is 67.7 kHz above the carrier frequency f.
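The 67.7 kHz figure follows directly from Eqn. 3: a rotation of φ radians per symbol at symbol rate Rs corresponds to a tone at φ/(2π) · Rs above the carrier. A quick check, using the GSM symbol rate of 1625/6 ≈ 270.833 ksymbols/s:

```python
import numpy as np

# Frequency offset of the FCB tone implied by the constant phase
# rotation of Eqn. 3: phi radians per symbol at symbol rate Rs gives
# a tone at phi / (2*pi) * Rs above the carrier.
phi = np.pi / 2          # phase rotation per symbol in GSM
Rs = 1625e3 / 6          # GSM symbol rate, ~270.833 ksymbols/s
f_offset = phi / (2 * np.pi) * Rs   # = Rs / 4
print(round(f_offset / 1e3, 1))     # -> 67.7 (kHz)
```

That is, the FCB tone sits at exactly one quarter of the symbol rate above the carrier frequency.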
A wireless device synchronises with a base station by first detecting one or more FCBs transmitted by that base station. This is known as frequency burst detection. The wireless device then uses the detected FCB(s) to synchronise with the base station by, for example, determining the difference between the frequency of its own local oscillator, and the frequency of the FCB. This can in turn be used to determine the frequency offset between the local oscillator of the wireless device and the local oscillator of the transmitting base station (due to the fact that the FCB is a known constant frequency above the carrier frequency).
In some cases, the time at which the FCB was received is also determined, and this information is used by the wireless device to synchronise its time slot boundaries with those of the base station.
There are many existing methods for detecting FCBs. One simple method is to apply a bandpass filter around the expected frequency of the FCB (i.e. carrier frequency+67.7 kHz). The power of the received signal before filtering is then compared to the power of the received signal after filtering. If the power is the same or similar, it is determined that an FCB has been received; whereas, if the power has been reduced significantly, it is assumed that no FCB has been received. Such a method is not useful, however, when there is a large frequency offset between the transmitting base station and the receiving wireless device, because the apparent frequency of a received FCB will be significantly shifted from the expected frequency and will be filtered out by the bandpass filter, leading to a determination that no FCB has been received.
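The bandpass power comparison described above can be sketched as follows. This is a minimal FFT-based illustration, not an implementation from any standard: the function name, the 10 kHz band width, the power-ratio threshold of 0.5 and the sample rate in the usage example are all illustrative assumptions.

```python
import numpy as np

# Hedged sketch of the bandpass power-comparison detector: compare the
# power of the received baseband samples inside a narrow band around the
# expected FCB tone (+67.7 kHz) to the total received power. The band
# width and threshold below are illustrative assumptions.
def fcb_bandpass_detect(samples, fs, f_tone=67.7e3, bw=10e3, ratio_thresh=0.5):
    spectrum = np.fft.fft(samples)
    freqs = np.fft.fftfreq(samples.size, d=1.0 / fs)
    in_band = np.abs(freqs - f_tone) < bw / 2
    band_power = np.sum(np.abs(spectrum[in_band]) ** 2)
    total_power = np.sum(np.abs(spectrum) ** 2)
    return band_power / total_power > ratio_thresh

# A pure complex tone at +67.7 kHz keeps most of its power in the band,
# so it is flagged as an FCB; wideband noise is not.
fs = 541666.0                       # illustrative sample rate (2x symbol rate)
t = np.arange(4096) / fs
tone = np.exp(2j * np.pi * 67.7e3 * t)
```

As the text notes, a large frequency offset shifts the received tone outside the `in_band` window, so this detector then reports that no FCB was received.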
The paper “Low-complexity Frequency Synchronization for GSM Systems: Algorithms and Implementation” by Harald Kroll, Stefan Zwicky, Christian Benkeser, Qiuting Huang and Andreas Burg, as published in “IV International Congress on Ultra Modern Telecommunications and Control Systems 2012”, pages 168 to 173, describes two alternative methods.
In the first of these methods, the phase difference between consecutively received symbols is determined, and the variance of the phase differences is analysed. As explained above, the modulation of the data bits in an FCB gives rise to a constant phase rotation per data bit. In a noiseless system, the variance of the phase differences should therefore be zero. The variance is compared to a threshold value to determine whether it is likely that an FCB has been received. As will be appreciated, the reliability of this method decreases as the SNR decreases, because noise causes the phase differences to vary.
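The phase-difference variance test can be sketched as below. This is an illustrative reconstruction from the description above, not code from the cited paper; the function name and the variance threshold are assumptions.

```python
import numpy as np

# Hedged sketch of the phase-difference variance method: for complex
# baseband samples taken at the symbol rate, an FCB produces a constant
# phase step per symbol, so the variance of the successive phase
# differences is near zero. The threshold is an illustrative assumption.
def fcb_phase_variance_detect(symbols, var_thresh=0.1):
    # Phase difference between consecutive symbols via the conjugate
    # product, which avoids 2*pi wrap-around problems of raw angles.
    diffs = np.angle(symbols[1:] * np.conj(symbols[:-1]))
    return np.var(diffs) < var_thresh

# A noiseless FCB: constant rotation of pi/2 per symbol over 142 symbols.
fcb = np.exp(1j * (np.pi / 2) * np.arange(142))
```

For the noiseless FCB the phase differences are all exactly π/2, so the variance is zero and the burst is detected; random-phase symbols produce a large variance and are rejected.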
The second of these methods utilises the fact that the FCBs are transmitted periodically, and can therefore give rise to a periodically repeating signal pattern. In this method, the autocorrelation of a received signal is determined, and if the autocorrelation exceeds a certain threshold, it is determined that an FCB has been detected. It will be appreciated that at least two frequency correction bursts need to be received by the wireless device before an FCB can be detected.
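The periodicity-based method can be sketched as a lagged correlation of the received samples with themselves, the lag being one FCB period. Again this is an illustrative reconstruction, not the cited paper's implementation; the function name and the 0.5 threshold are assumptions.

```python
import numpy as np

# Hedged sketch of the autocorrelation method: correlate the received
# samples with a copy of themselves delayed by one FCB period. When two
# FCBs are present exactly one period apart, the normalised correlation
# magnitude is high. The threshold is an illustrative assumption.
def fcb_autocorr_detect(samples, lag, thresh=0.5):
    a, b = samples[:-lag], samples[lag:]
    corr = np.abs(np.vdot(a, b))     # vdot conjugates its first argument
    norm = np.sqrt(np.vdot(a, a).real * np.vdot(b, b).real)
    return corr / norm > thresh
```

Note that, as the text observes, this detector inherently needs at least two bursts: a single burst with no repeat one period later yields a low correlation.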
There is a need for wireless devices to detect FCBs transmitted by base stations rapidly and reliably, not least because, until a wireless device has detected an FCB from a base station, it will be unsynchronised with that base station and will therefore be unable to communicate with it. There is also a need to reduce the power consumption of wireless devices, and long FCB monitoring periods increase power consumption.
Another reason that FCBs need to be detected reliably and quickly, which has emerged relatively recently, is that GSM carriers are beginning to be "refarmed" for use as 3G/LTE carriers. These "refarmed" 3G/LTE carriers may look similar to GSM carriers in a GSM cell search context, and a wireless device may therefore end up spending unnecessary time and power looking for FCBs on a refarmed 3G/LTE carrier, which, of course, will not be a valid GSM carrier for which GSM synchronisation is possible. Thus, a system is needed in which a wireless device can quickly determine whether a frequency correction burst has been received, in order to rule out the carrier or continue the GSM synchronisation procedure.
Ideally, the maximum time taken for an FCB to be detected by a wireless device should be the period between FCB transmissions (i.e. 46.2 ms or 50.8 ms, depending on the GSM system). However, for each of the methods of detecting FCBs discussed above, and indeed many others not mentioned, detection can often take longer than this period. This is because received FCBs can be "missed", or the SNR may be too low for reliable FCB detection. There is therefore a need for a new method of detecting received FCBs which is fast and reliable.