Long Term Evolution (LTE) is an evolution of the universal mobile telecommunication system (UMTS) that provides higher data rates, lower latency, and improved system capacity. In LTE systems, an evolved universal terrestrial radio access network includes a plurality of base stations, referred to as evolved Node-Bs (eNBs), communicating with a plurality of mobile stations, referred to as user equipment (UE). A UE may communicate with a base station or an eNB via the downlink and uplink. The downlink (DL) refers to the communication from the base station to the UE. The uplink (UL) refers to the communication from the UE to the base station. LTE is commonly marketed as 4G LTE, and the LTE standard is developed by 3GPP.
In a wireless cellular communications system, multiuser multiple-input multiple-output (MU-MIMO) is a promising technique to significantly increase the cell capacity. In MU-MIMO, the signals intended for different users are simultaneously transmitted with orthogonal (or quasi-orthogonal) precoders. On top of that, the concept of jointly optimizing MU operation from both the transmitter's and the receiver's perspectives has the potential to further improve MU system capacity even if the transmission and precoding are non-orthogonal. One example is the simultaneous transmission of a large number of non-orthogonal beams/layers, with the possibility of more than one layer of data transmission in a beam. Such non-orthogonal transmission allows multiple users to share the same resource elements without spatial separation, and improves the multiuser system capacity for networks with a small number of transmit antennas (e.g., 2 or 4, or even 1), where MU-MIMO based on spatial multiplexing is typically limited by wide beamwidth.
Joint Tx/Rx optimization combining adaptive Tx power allocation with a codeword-level interference cancellation (CW-IC) receiver has recently become a notable technical trend, encompassing non-orthogonal multiple access (NOMA) and other schemes based on downlink multiuser superposition transmission (MUST). In MUST, the signals intended for two users are superposed and occupy the same time-frequency radio resource. To benefit from MUST, the two co-scheduled users generally need to have a large difference in received signal quality, e.g., in terms of the received signal-to-interference-plus-noise ratio (SINR). In a typical scenario, one of the users is geometrically close to the base station, and the other user is geometrically far away from the base station. The former user and the latter user are referred to as the near-user and the far-user, respectively.
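The superposition and near-user interference cancellation described above can be sketched at symbol level. This is a minimal illustrative sketch, not the MUST specification: it assumes an AWGN channel, QPSK for both users, a hypothetical power split `alpha` favoring the far-user, and hard-decision cancellation in place of a full codeword-level (CW-IC) decoder.

```python
import numpy as np

rng = np.random.default_rng(0)

def qpsk(bits):
    # Map bit pairs to unit-power QPSK symbols
    return ((1 - 2*bits[0::2]) + 1j*(1 - 2*bits[1::2])) / np.sqrt(2)

n = 1000
x_near = qpsk(rng.integers(0, 2, 2*n))  # near-user (high SINR) symbols
x_far = qpsk(rng.integers(0, 2, 2*n))   # far-user (low SINR) symbols

# Superposition: the far-user signal gets the larger power share (alpha is illustrative)
alpha = 0.8
x = np.sqrt(alpha)*x_far + np.sqrt(1 - alpha)*x_near

# Near user receives the superposed signal over an AWGN channel
noise = 0.01*(rng.standard_normal(n) + 1j*rng.standard_normal(n))/np.sqrt(2)
y_near = x + noise

# Successive cancellation at the near user: detect the stronger far-user
# signal first, subtract it, then detect the near-user's own signal
x_far_hat = (np.sign(y_near.real) + 1j*np.sign(y_near.imag))/np.sqrt(2)
y_clean = y_near - np.sqrt(alpha)*x_far_hat
x_near_hat = (np.sign(y_clean.real) + 1j*np.sign(y_clean.imag))/np.sqrt(2)
```

In an actual CW-IC receiver the far-user's codeword is decoded and re-encoded before cancellation, which is far more robust than the symbol-level hard decision used here.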
In order to apply MUST precoding, the transmitting station is required to know the Channel State Information (CSI) of the radio channels connecting it to each of the receiving stations. In 3GPP LTE systems, it is common for the receiving stations (e.g., UEs) to measure CSI and report it to the transmitting station (e.g., eNB) via an uplink feedback channel. The CSI feedback contains a rank indicator (RI), a channel quality indicator (CQI), and a precoding matrix indicator (PMI) for each downlink channel.
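The content of one CSI report can be summarized as a simple record. The field names below are hypothetical and for illustration only; the actual LTE reporting formats and codebook index ranges are defined by the 3GPP specifications.

```python
from dataclasses import dataclass

@dataclass
class CsiReport:
    ri: int         # rank indicator: number of spatial layers the UE recommends
    pmi: int        # precoding matrix indicator: index into the precoder codebook
    cqi: list[int]  # channel quality indicator, one entry per codeword

# Example: a rank-2 report with one CQI per codeword (values illustrative)
report = CsiReport(ri=2, pmi=3, cqi=[11, 9])
```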
In the current LTE communication system, the UE determines the CQIs based on the output SINRs of an MMSE receiver. However, the fed-back SINRs may not match the actual SINRs experienced by the UE. In a first scenario, the UE reports RI=1, but the actual transmission contains two spatial layers. In a second scenario, the UE reports RI=2 with a certain PMI, but the eNB uses a different PMI for the MU-MIMO transmission. As a result, the CSI feedback received by the eNB does not reflect the actual channel state of the UE, leaving the eNB unable to perform MUST precoding effectively.
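The PMI-mismatch problem above can be illustrated numerically. The sketch below uses the standard post-MMSE output SINR expression for an effective channel HW, namely SINR_k = 1/[(I + (1/sigma^2) (HW)^H HW)^{-1}]_kk - 1; the channel realization, noise variance, and the two precoders are illustrative assumptions, not codebook entries from the LTE specification.

```python
import numpy as np

rng = np.random.default_rng(1)

def mmse_sinr(H, W, noise_var):
    """Per-layer output SINR of an MMSE receiver for effective channel H @ W."""
    Heff = H @ W
    G = Heff.conj().T @ Heff / noise_var
    A = np.linalg.inv(np.eye(G.shape[0]) + G)
    return np.real(1.0/np.diag(A) - 1.0)

# Illustrative 2x2 Rayleigh channel and noise variance
H = (rng.standard_normal((2, 2)) + 1j*rng.standard_normal((2, 2)))/np.sqrt(2)
noise_var = 0.1

# Rank-1 precoder the UE assumed when deriving its CQI (hypothetical)
W_reported = np.array([[1], [1]])/np.sqrt(2)
# Different precoder actually applied by the eNB for the MU-MIMO transmission
W_actual = np.array([[1], [-1]])/np.sqrt(2)

sinr_reported = mmse_sinr(H, W_reported, noise_var)
sinr_actual = mmse_sinr(H, W_actual, noise_var)
# The SINR underlying the reported CQI differs from the SINR actually experienced
```

Because the CQI is a quantization of `sinr_reported` while the transmission experiences `sinr_actual`, the eNB's link adaptation and MUST power allocation operate on stale information.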
A solution is sought.