When there is relative movement between a transmitter and a receiver in a communication system, the frequency of the signal received by the receiver is shifted from the frequency of the signal transmitted by the transmitter, where the shift is referred to as a Doppler shift and is calculated by the equation
Δf = (f · v / c) · cos θ,
where c is the velocity of light, f is the frequency of the transmitted signal, v is the speed of relative movement between the receiver and the transmitter, and θ is the angle between the movement direction of the receiver and the propagation direction of the signal.
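The equation above can be evaluated directly; a minimal sketch follows, in which the function name `doppler_shift` and the default for c are illustrative conveniences rather than anything specified in the text.

```python
import math

def doppler_shift(f_hz, v_mps, theta_rad, c=3.0e8):
    """Doppler shift in Hz: (f * v / c) * cos(theta)."""
    return f_hz * v_mps / c * math.cos(theta_rad)

# Scenario from the text: 2.6 GHz carrier, UE at 300 km/h,
# worst case theta = 0 (moving straight toward the transmitter).
shift = doppler_shift(2.6e9, 300 / 3.6, 0.0)
print(round(shift))  # about 722 Hz, matching the ~720 Hz figure cited
```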
In a Long Term Evolution (LTE) system, when a UE is moving at a high speed, the absolute value of the Doppler shift is large due to the high LTE carrier frequency. For example, in a scenario of a high-speed train carrying an LTE UE, where the carrier frequency is 2.6 GHz and the UE moves at 300 km/h, there is a Doppler shift of approximately 720 Hz, which is so significant that the performance of the receiver is sharply degraded, the throughput of the network drops, and the UE may even be hindered from accessing the network.

In such a high-speed scenario, in order to keep the performance from being sharply degraded by frequent handovers, cells are typically merged, that is, a plurality of Radio Remote Units (RRUs) distributed along a highway or a railway are concatenated into one logical cell so as to extend the coverage radius of the cell. In this case, when the UE moves from the coverage area of one RRU to the coverage area of another RRU, the signals received from the two RRUs propagate in opposite directions, so they carry opposite Doppler shifts, thus degrading the performance of the UE in receiving a downlink signal. In particular, at a place where the coverage areas overlap, the amplitudes of the signals received from the two RRUs are so close to each other that the UE can neither make the frequency of its local oscillator agree with the frequencies of the received signals through Automatic Frequency Control (AFC), nor distinguish the signals of the two RRUs from each other and pre-correct them respectively, thus seriously degrading the performance of receiving the downlink signal.
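The opposite-shift situation between two RRUs can be made concrete with the Doppler equation: a UE travelling from RRU1 toward RRU2 sees RRU1's signal arrive from behind (θ = π) and RRU2's from ahead (θ = 0). The numbers below reuse the 2.6 GHz / 300 km/h example and are illustrative only.

```python
import math

C = 3.0e8          # velocity of light, m/s
f = 2.6e9          # carrier frequency, Hz
v = 300 / 3.6      # UE speed, m/s (300 km/h)

# RRU1 is behind the UE (theta = pi), RRU2 is ahead (theta = 0),
# so the two received signals carry opposite Doppler shifts.
shift_rru1 = f * v / C * math.cos(math.pi)
shift_rru2 = f * v / C * math.cos(0.0)
print(round(shift_rru1), round(shift_rru2))  # about -722 and +722

# The UE's AFC would have to track both at once: the total spread
# between the two received frequencies is roughly 1.4 kHz.
spread = shift_rru2 - shift_rru1
```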
In the prior art, since an eNB may transmit a Dedicated Reference Signal (DRS) to a UE in some special transmission mode, the eNB can instruct the UE via Radio Resource Control (RRC) signaling (e.g., an RRC Reconfiguration message) to operate in the specific transmission mode. In such a specific transmission mode, the eNB pre-corrects only the DRS and the Physical Downlink Shared Channel (PDSCH) to be transmitted to the UE, but does not pre-correct the Cell-specific Reference Signal (CRS). The UE obtains the corresponding downlink frequency shift from the received DRS, and subsequently corrects the pre-corrected downlink signal according to the obtained downlink frequency shift upon reception of the downlink signal.
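Pre-correction of this kind amounts to rotating the complex baseband samples so that the Doppler shift seen at the receiver cancels out. The sketch below shows only that standard rotation; the function `pre_correct`, the sampling rate, and the offset values are assumptions for illustration, not the eNB's actual processing chain.

```python
import numpy as np

def pre_correct(samples, delta_f_hz, fs_hz):
    """Rotate complex baseband samples by -delta_f_hz so that a
    Doppler shift of +delta_f_hz seen at the receiver cancels out."""
    n = np.arange(len(samples))
    return samples * np.exp(-2j * np.pi * delta_f_hz * n / fs_hz)

# Demo: a tone already offset by +722 Hz collapses to DC after
# pre-correction (fs is a common LTE baseband sampling rate).
fs = 30.72e6
n = np.arange(1024)
tone = np.exp(2j * np.pi * 722.0 * n / fs)
flat = pre_correct(tone, 722.0, fs)
print(np.allclose(flat, np.ones(1024)))  # True
```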
However, the existing solution of transmitting a DRS in a specific mode pre-estimates and compensates for the downlink frequency shift in a high-speed scenario starting at the protocol layer. In this solution, the eNB shall configure the UE with the special transmission mode via RRC signaling, thus resulting in a signaling burden; and downlink pre-correction and correction shall be performed by the network side and the UE side in cooperation, so both the eNB and the UE shall support this solution. Moreover, when the traffic of the UE is sparse, a DRS shall still be transmitted periodically to the UE for downlink scheduling with the UE, thus resulting in unnecessary scheduling.
Accordingly, it is desirable to provide a new method and device for pre-correcting a downlink signal so as to remedy the drawbacks of the existing solution to pre-correcting a downlink signal.