In an existing wireless communications network (such as a 2G, 3G, or 4G network, where G is short for generation), the operating frequency bands of communications systems all lie in a frequency range below 6 GHz. Available operating frequency bands in this range are strained and cannot meet an increasing communication requirement. On the other hand, a large quantity of frequency bands in the frequency range above 6 GHz are not fully utilized. Therefore, a future (for example, 5G) wireless communications network whose operating frequency band is above 6 GHz is under research and development in the industry, to provide an ultrafast data communications service. In the frequency range above 6 GHz, frequency bands at 28 GHz, 39 GHz, 60 GHz, 73 GHz, and the like are available for a next-generation wireless communications network.

Because its operating frequency band is above 6 GHz, the next-generation wireless communications network has significant features of a high-frequency communications system, such as high bandwidth and a highly integrated antenna array, so a relatively high throughput is easy to achieve. However, compared with the existing wireless communications network, a next-generation wireless communications network operating in a range above 6 GHz is subject to more severe intermediate radio frequency distortion, especially the impact of phase noise (PHN). In addition, as the carrier frequency increases, a Doppler effect and a carrier frequency offset (CFO) have greater impact on the performance of the high-frequency communications system. The phase noise, the Doppler effect, and the CFO have one thing in common: each introduces a phase error into data reception in the high-frequency communications system, so that the high-frequency communications system is degraded in performance or even cannot operate.
Using the phase noise as an example, when the carrier frequency increases from f1 to f2, the phase noise level increases by 20·log10(f2/f1) dB. Using a 2 GHz frequency band and a 28 GHz frequency band as an example, the phase noise level of the 28 GHz frequency band is about 23 dB higher than that of the 2 GHz frequency band, because 20·log10(28/2) ≈ 23. A higher phase noise level causes a larger phase error, and therefore has greater impact on a signal.
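The scaling above can be checked numerically. The function name below is illustrative, not from the source:

```python
import math

def phase_noise_increase_db(f1_hz: float, f2_hz: float) -> float:
    """Increase in phase noise level (dB) when the carrier moves from f1 to f2."""
    return 20 * math.log10(f2_hz / f1_hz)

# Moving from a 2 GHz carrier to a 28 GHz carrier:
print(round(phase_noise_increase_db(2e9, 28e9), 1))  # 22.9, i.e. about 23 dB
```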
A reference signal is a to-be-sent signal to which a transmit end adds known pilot symbols; a receive end then uses the known pilot symbols to perform a specific function, such as channel estimation or phase tracking. The most common method for phase noise estimation is to estimate the phase error by using an inserted phase tracking reference signal (PTRS).
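As a minimal sketch of the idea (not the standardized receiver procedure), a common phase error on one OFDM symbol can be estimated by correlating the received PTRS samples with the known pilots and taking the angle of the result:

```python
import numpy as np

def estimate_common_phase_error(received, pilots):
    """Estimate the common phase rotation from PTRS pilots (illustrative sketch).

    received: complex samples at the PTRS subcarriers after equalization
    pilots:   the known transmitted pilot symbols at those subcarriers
    Returns the estimated phase error in radians.
    """
    # Correlating with the conjugate of the known pilots averages out noise;
    # the angle of the sum is the common phase error.
    return np.angle(np.sum(received * np.conj(pilots)))

# Toy check: pilots rotated by 0.3 rad are recovered.
rng = np.random.default_rng(0)
pilots = np.exp(1j * rng.uniform(0, 2 * np.pi, 16))
received = pilots * np.exp(1j * 0.3)
print(round(estimate_common_phase_error(received, pilots), 3))  # 0.3
```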
Because of physical features of the phase noise, PTRS design usually has the following features:
The phase noise changes randomly in time, and its coherence time is relatively short. In PTRS design, the coherence time may be understood as the quantity of consecutive orthogonal frequency division multiplexing (OFDM) symbols that experience the same phase noise. Therefore, a reference signal for phase noise estimation usually needs to have relatively high time-domain density. In addition, different transmission conditions impose different requirements on the time-domain density of the PTRS.
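As an illustration of time-domain density (the mapping pattern below is a hypothetical example, not a standardized one), a PTRS may be placed on every OFDM symbol, every second symbol, or every fourth symbol of a slot depending on the transmission condition:

```python
def ptrs_time_indices(num_symbols: int, density: int) -> list:
    """OFDM symbol indices carrying PTRS for a given time-domain density.

    density = 1 places a PTRS on every symbol, 2 on every other symbol,
    4 on every fourth symbol (a hypothetical pattern for illustration).
    """
    return list(range(0, num_symbols, density))

# A 14-symbol slot under three candidate densities:
for d in (1, 2, 4):
    print(d, ptrs_time_indices(14, d))
```

A shorter coherence time (faster-varying phase noise) calls for the denser patterns.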
The phase noise is generated due to the non-ideality of a local oscillator, so different antenna ports driven by the same local oscillator have the same phase noise. In particular, demodulation reference signal (DMRS) antenna ports physically connected to the same local oscillator have the same phase noise, where one antenna port corresponds to one DMRS port. Therefore, only one PTRS antenna port needs to be configured for a plurality of DMRS antenna ports that share the same local oscillator, to carry a PTRS. The phase noise on this group of DMRS antenna ports can then be estimated by using the PTRS sent on that PTRS antenna port.
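The port-sharing rule can be sketched as grouping DMRS ports by their local oscillator and associating one PTRS port with each group. The port numbers and oscillator labels below are hypothetical:

```python
from collections import defaultdict

# Hypothetical mapping of DMRS antenna ports to the local oscillator driving them.
dmrs_port_to_oscillator = {1000: "LO_A", 1001: "LO_A", 1002: "LO_B", 1003: "LO_B"}

def assign_ptrs_ports(port_to_lo: dict) -> dict:
    """Group DMRS ports by local oscillator and pick one PTRS port per group."""
    groups = defaultdict(list)
    for port, lo in sorted(port_to_lo.items()):
        groups[lo].append(port)
    # One PTRS port suffices per oscillator; here we (arbitrarily) associate the
    # lowest-numbered DMRS port of each group with the PTRS.
    return {lo: ports[0] for lo, ports in groups.items()}

print(assign_ptrs_ports(dmrs_port_to_oscillator))  # {'LO_A': 1000, 'LO_B': 1002}
```

Ports in the same group inherit the phase error estimated from that single PTRS.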
With the development of communications technologies, a plurality of local oscillators may be used to form an antenna port set, so that the antenna ports in one set no longer all share the same phase noise and different PTRSs need to be used. How to select a proper PTRS therefore becomes a new task.