In a Time Division Duplex (TDD) network, uplink (UL) and downlink (DL) transmissions share the same frequency channel and are separated in time. When user equipment (UE) is synchronized to a cell in a TDD network, it opens its downlink receiver only during the time periods allocated for downlink transmissions. In the initial synchronization phase, when the user equipment is first trying to receive synchronization signals in a cell, the user equipment is not yet aware of the frame timing of the cell. In this phase, the user equipment therefore needs to search continuously on the channel frequency for some time in order to find the synchronization channels.
In Long Term Evolution (LTE) TDD networks, synchronization channels are transmitted from the base station (eNodeB) at an interval of 5 ms. The minimum time required for the search is therefore 5 ms plus the length of the synchronization symbol. During this search time, the user equipment receiver captures all signals transmitted on the channel frequency.
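The minimum search window can be sketched numerically. This is an illustrative calculation only; the 5 ms period comes from the text, while the symbol duration of roughly 71.4 µs (one OFDM symbol with normal cyclic prefix in the LTE numerology) is an assumption added here.

```python
# Illustrative sketch of the minimum sync-search window in LTE TDD.
# The 5 ms synchronization interval is given; the OFDM symbol duration
# (~71.4 us, normal cyclic prefix) is an assumed LTE numerology value.

SYNC_PERIOD_MS = 5.0   # interval between synchronization channel transmissions
SYMBOL_US = 71.4       # approximate OFDM symbol duration (assumed, normal CP)

min_search_window_ms = SYNC_PERIOD_MS + SYMBOL_US / 1000.0
print(f"minimum search window \u2248 {min_search_window_ms:.4f} ms")
```

The receiver must keep capturing for at least this long to guarantee that one complete synchronization symbol falls inside the window regardless of frame timing.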
A problem may arise for the user equipment if another user equipment nearby is already synchronized to the same cell (or to the same network) and is allocated uplink transmissions while the first user equipment is capturing the signal for its search. FIG. 1A illustrates this situation. The interfering signal as seen from the first user equipment 100 can be extremely strong if the second user equipment 102 is close to the first user equipment 100. One particularly difficult situation occurs when the downlink signal from a base station 104 is weak and the second user equipment 102 is close to the first user equipment 100.
If the downlink signals are close to or below a reference sensitivity level and the second user equipment 102 is within a few meters of the first user equipment 100, the level difference between the desired signal and the interfering signal can be around 100 dB. Although the desired and interfering signals are separated in time, this scenario poses a problem for the receiver's Automatic Gain Control (AGC).
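A rough link-budget sketch makes the near-far gap concrete. All numbers below are illustrative assumptions added here, not values from the text: a UE transmit power of +23 dBm, roughly 38 dB of free-space loss over about one meter at 2 GHz, and a wanted signal near a typical reference sensitivity of −97 dBm. With the wanted signal below reference sensitivity, the gap approaches the roughly 100 dB mentioned above.

```python
# Hypothetical link-budget sketch of the near-far scenario described above.
# Every number here is an assumed, illustrative value.

desired_dbm = -97.0       # wanted DL signal near reference sensitivity (assumed)
ue_tx_power_dbm = 23.0    # interfering UE transmit power (assumed)
path_loss_db = 38.0       # assumed free-space loss over ~1 m at ~2 GHz

interferer_dbm = ue_tx_power_dbm - path_loss_db     # -15 dBm at the victim UE
level_difference_db = interferer_dbm - desired_dbm  # dynamic range the AGC sees
print(f"level difference \u2248 {level_difference_db:.0f} dB")
```

The AGC must bridge this entire level difference between adjacent regions of the same time frame.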
In another difficult situation, there is only a very short time between the uplink region and the synchronization channels in the radio time frame. The AGC has to settle to a usable radio frequency (RF) gain before the synchronization channel is received. During the uplink region, the AGC may have been seriously misadjusted, i.e. the radio frequency gain may have been set very low to adapt to the strong UL signal. During the transition region from the uplink region to the start of the synchronization channels, the AGC has to iterate towards a radio frequency gain setting suitable for receiving the weak downlink signal.
In this transition region, there is a further risk that only Common Reference Signals (CRS) are transmitted from the eNodeB. This is the case if no downlink allocations are made to any user equipment during this period. As CRS symbols are transmitted with a separation in time of three or four symbols, this places a further requirement on the length of the signal power measurement performed by the AGC to estimate the signal level.
Because of the minimum power measurement length for each AGC iteration, there is a limit on the number of iterations that can be performed in the transition region. The AGC is normally only able to step the gain up (or down) by a certain number of decibels per iteration. The step size depends on, for example, an assessment of the received signal level using the current gain setting.
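The step-limited behaviour described above can be sketched as a simple loop. This is a minimal illustration, not an actual AGC implementation; the starting gain, target gain, and per-iteration step limit are assumed values chosen only to show why a large gain error cannot be corrected in a handful of iterations.

```python
# Minimal sketch of a step-limited AGC loop. Each iteration represents one
# power measurement interval; the gain may move by at most max_step_db per
# iteration. All numeric values below are illustrative assumptions.

def agc_iterations(start_gain_db, target_gain_db, max_step_db):
    """Count iterations a step-limited AGC needs to reach the target gain."""
    gain = start_gain_db
    steps = 0
    while abs(target_gain_db - gain) > 1e-9:
        # Move toward the target, clamped to the per-iteration step limit.
        delta = max(-max_step_db, min(max_step_db, target_gain_db - gain))
        gain += delta
        steps += 1
    return steps

# E.g. recovering from a gain misadjusted by 60 dB with 10 dB steps:
print(agc_iterations(start_gain_db=10.0, target_gain_db=70.0, max_step_db=10.0))
# -> 6 iterations, each costing one full power measurement
```

If the transition region only leaves time for two or three measurements, a gain error of this size cannot be corrected before the synchronization channel arrives.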
Another problem for the AGC occurs when there are no uplink allocations in the TDD uplink region and the user equipment that needs to synchronize to the LTE cell is very close to an eNodeB. FIG. 1B illustrates this situation. Here, the user equipment 106 may be receiving a very weak signal in the uplink region of the time frame; in theory, the received signal level can be as low as the thermal noise floor. Because the user equipment 106 is very close to the eNodeB 108, the synchronization channel (SCH) may be received as a very strong signal, for example −25 dBm. In this scenario, the AGC of the user equipment faces the challenge of settling to an appropriately low RF gain level for reception of the SCH shortly after having settled to a very high RF gain level in the uplink region.
A similar problem (i.e. settling to a usable radio frequency gain) exists in LTE Frequency Division Duplex (FDD) networks. For example, in an MBMS Single Frequency Network (MBSFN), the data load can vary in the MBSFN regions of the downlink signal. All symbols in a subframe except the first (which contains a CRS pilot) can be empty. If the wanted downlink signal is very strong (e.g. −25 dBm) and the gain control has settled to a very high value because no signal is received in the MBSFN region, the gain will have to change from that very high value to a very low value within only a few AGC iterations.
In view of the above, there is a need for a solution that solves, or at least mitigates, the problems and drawbacks described above.