In the Evolved Universal Terrestrial Radio Access Network (E-UTRAN) of the third generation mobile communication Long Term Evolution (LTE) system, uplink data is transmitted on the uplink shared channel, and the Evolved NodeB (eNB) allocates resources to each User Equipment (UE). The access technology applied by the E-UTRAN is Orthogonal Frequency Division Multiplexing (OFDM). Compared with the second generation mobile communication system, the radio resource management of the E-UTRAN system features a large bandwidth and multiple parallel time processes, and its radio resources have the two-dimensional attributes of time and frequency; thus the number of users borne on the resources increases significantly.
In order to allocate resources to each UE in accordance with its own needs so as to achieve better multiplexing performance in the uplink transmission, and in order to use the system bandwidth fully, flexibly and effectively, the LTE system defines a dedicated control message for allocating a user's uplink transmission resources. The control message dedicated to allocating resources on the Physical Uplink Shared Channel (PUSCH) is sent by the eNB to the UE; this resource allocation control message is also called the UpLink Grant (UL Grant). The UL Grant is sent on the Physical Downlink Control Channel (PDCCH) using Downlink Control Information format 0 (DCI format 0).
The basic process of uplink radio resource scheduling is as follows: the eNB determines which frequency resources are appropriate for the UE according to the uplink radio channel conditions of the target UE, indicates the frequency positions of those radio resources in the aforementioned control signaling, and sends the control signaling to the UE; after the UE receives the control signaling and demodulates the resource information, it learns how the eNB has scheduled its frequency resources, and uploads the service data at the designated frequency resource positions.
In the LTE protocol, some of the important fields included in DCI format 0 are as follows:
Frequency hopping indicator: used to indicate whether the frequency resources assigned by the signaling use the frequency-hopping mode or not;
Resource block assignment information: used to indicate the positions and sizes of the frequency resources used by the UE;
Modulation and Coding Scheme (MCS): used to indicate which modulation mode and coding rate should be applied in the data transmitted by the UE;
New data indicator: used to indicate to the UE whether to send new data or to retransmit old data;
Wherein, the resource block assignment information is the field used by the foregoing eNB to notify the UE which frequency resources can be used to bear the uplink data.
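The fields listed above can be pictured with the following minimal sketch. The field names and example values here are illustrative only; the real DCI format 0 defined by 3GPP contains additional fields and precise bit widths that are omitted.

```python
from dataclasses import dataclass

# Illustrative sketch of the DCI format 0 fields described above.
# Field names and value ranges are simplifications, not the exact
# 3GPP bit-level definition.
@dataclass
class DciFormat0:
    frequency_hopping: bool         # whether assigned resources use frequency hopping
    resource_block_assignment: int  # encodes the position and size of the frequency resources
    mcs_index: int                  # modulation and coding scheme index
    new_data_indicator: bool        # True = new data, False = retransmission

# A hypothetical grant as the UE might decode it from the PDCCH.
grant = DciFormat0(frequency_hopping=False,
                   resource_block_assignment=0x1A3,
                   mcs_index=10,
                   new_data_indicator=True)
```

Note that, as the text states, only the resource block assignment and the MCS directly tell the UE where and how to place its uplink data.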
In the current LTE protocol, the UE uses only the "resource block assignment information" and the "modulation and coding scheme" in the signaling information acquired from the eNB to determine which frequency resources the uplink data should be allocated to.
In addition, the Radio Resource Control (RRC) layer in the LTE system sends an RRC message to establish an RRC layer link between the UE and the eNB, to configure system parameters, to transfer UE capability parameters, and so on. The downlink RRC message is transmitted on the Physical Downlink Shared Channel (PDSCH).
The current LTE system supports multiple types of communication services, such as Voice over Internet Protocol (VoIP) services, File Transfer Protocol (FTP) data services, Hypertext Transfer Protocol (HTTP) Internet services, online game services and so on. The Quality of Service (QoS) parameters required by these services differ; at present, the QoS parameters mainly comprise the maximum allowable packet delay, the minimum allowable data rate, the maximum allowable packet error rate, the Prioritized Bit Rate (PBR), the Aggregate Maximum Bit Rate (AMBR) and so on. In the Media Access Control (MAC) layer of the UE, different types of service data are mapped to multiple logical channels (LCH), which are divided, according to the priorities derived from the different QoS of the services, into four logical channel groups (LCG). The UE then reports the data amounts of these four LCGs to the eNB in the form of a Buffer Status Report (BSR), and the eNB allocates frequency resources to the UE after comprehensively considering the UE's uplink channel quality, the BSR, and the QoS requirements of each service in the UE.
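The LCH-to-LCG mapping and BSR reporting described above amount to summing buffered data per group. The following is a minimal sketch of that bookkeeping; the function name, channel names, and group assignments are hypothetical, not taken from the standard.

```python
from collections import defaultdict

def build_bsr(lch_buffers, lch_to_lcg):
    """Sum the buffered bytes of each logical channel into its
    logical channel group (LCG 0..3), as reported in a BSR.
    lch_buffers: {channel_name: buffered_bytes}
    lch_to_lcg:  {channel_name: lcg_index}
    """
    bsr = defaultdict(int)
    for lch, nbytes in lch_buffers.items():
        bsr[lch_to_lcg[lch]] += nbytes
    return dict(bsr)

# Hypothetical example: three services mapped to three different LCGs.
lch_buffers = {"voip": 120, "ftp": 50000, "http": 8000}
lch_to_lcg = {"voip": 0, "ftp": 2, "http": 1}
report = build_bsr(lch_buffers, lch_to_lcg)
```

The eNB sees only the per-LCG totals in `report`, not the per-channel breakdown, which is part of why its scheduling intention cannot be expressed per logical channel.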
The uplink in the existing LTE system Release 8 uses the Single Carrier-Frequency Division Multiple Access (SC-FDMA) transmission technology, which imposes the limitation that the frequency resources assigned to the UE must be contiguous when the eNB carries out uplink frequency resource scheduling. After the UE receives the frequency resource assignment signaling, if the UE has data on multiple LCHs (meaning the UE has started up multiple types of services), the UE allocates the resources to each logical channel according to the following steps:
allocate radio resources, in descending order of logical channel priority, to the logical channels that have data to transmit and whose average data rate does not yet satisfy their PBR requirements;
if there are remaining resources, allocate them in turn to the logical channels that have data to transmit, in descending order of logical channel priority; logical channels with the same priority enjoy equal opportunities to be allocated radio resources.
Moreover, the UE should follow these principles when executing the abovementioned steps:
when one Radio Link Control Service Data Unit (RLC SDU) in the LCH data to be transmitted can be completely borne in the remaining resources, the UE should not segment the RLC SDU for transmission (that is, it should not allocate resources for only a part of the data of that SDU in this resource allocation);
if the UE does segment an RLC SDU, it should use the remaining resources to bear as large a segment of the RLC SDU as possible;
the UE should maximize the amount of data transmitted.
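The two allocation rounds above can be sketched as follows. This is a simplified model under stated assumptions: channels are pre-sorted by descending priority, amounts are in bytes, and the RLC SDU segmentation principles are deliberately omitted; the function and field names are illustrative.

```python
def allocate(grant_bytes, channels):
    """Two-round logical channel allocation sketch.
    channels: list of dicts sorted by descending priority, each with
      'buffer'      - bytes waiting to be transmitted
      'pbr_deficit' - bytes still owed to this channel to meet its PBR
    Returns the bytes granted to each channel, in the same order.
    """
    granted = [0] * len(channels)
    remaining = grant_bytes
    # Round 1: serve each channel up to its PBR deficit, highest priority first.
    for i, ch in enumerate(channels):
        take = min(remaining, ch['buffer'], ch['pbr_deficit'])
        granted[i] += take
        remaining -= take
    # Round 2: distribute any leftover resources in strict priority order.
    for i, ch in enumerate(channels):
        take = min(remaining, ch['buffer'] - granted[i])
        granted[i] += take
        remaining -= take
    return granted

# Hypothetical example: a 600-byte grant shared by two channels.
result = allocate(600, [{'buffer': 300, 'pbr_deficit': 100},
                        {'buffer': 500, 'pbr_deficit': 200}])
```

In the example, round 1 gives 100 and 200 bytes respectively to satisfy the PBR deficits, and round 2 hands the remaining 300 bytes out in priority order, so the first channel is fully served before the second receives more.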
Wherein, the principles for formulating the LCH priorities differ according to the requirements of different operators; for example, in some cells the operators might require that the VoIP service have the highest priority, while in other cells the operators might require that Internet services have the highest priority. When the UE starts up multiple types of services, in order to ensure that the QoS requirements of the various services are satisfied, the eNB must select the frequency resources in accordance with the requirement of the service that is strictest in QoS. This in fact wastes a certain quantity of frequency resources, since the maximum Packet Error Rates (PER) required by different types of services differ, and so does the channel quality required of the frequency resources; if the UE places the data of services with low channel quality requirements on frequency resources of high quality, the resources are wasted.
The principle defined in the current LTE system on how the UE places logical channel data into the frequency resources (refer to the above description) lacks any rule to "select the appropriate logical channel data and put it into the frequency resources according to the difference in channel quality"; thus the UE simply allocates the resources rigidly to the logical channel data with the highest priority.
For the current LTE system, since the resources bearing the UL Grant are limited, although the PUSCH resource utilization could be improved if more control information were added into the UL Grant for uplink scheduling, doing so would bring a larger control signaling overhead, which would cancel out most of the shared channel transmission gain. In addition, although the existing RRC message can be utilized, or a new MAC Control Element (MAC CE) can be added, to bear a part of the control information, these approaches also increase the control information overhead. Therefore, although the current uplink scheduling mechanism in the LTE system has shortcomings, its performance is still acceptable.
According to the definition in the protocol standard of the current LTE MAC layer (3GPP TS 36.321), the foregoing MAC control element is a part of the MAC layer Protocol Data Unit (MAC PDU), and one MAC PDU is composed of one MAC header, zero or more MAC control elements, zero or more MAC Service Data Units (SDU) and optional padding data.
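The MAC PDU composition just described can be mirrored structurally as follows. This sketch only reflects the composition stated above; the actual byte layouts, subheaders and length fields of TS 36.321 are omitted, and the class and method names are illustrative.

```python
from dataclasses import dataclass, field
from typing import List

# Structural sketch of a MAC PDU: one header, zero or more control
# elements, zero or more SDUs, and optional padding, matching the
# composition described in the text. Byte-level encoding is omitted.
@dataclass
class MacPdu:
    header: bytes
    control_elements: List[bytes] = field(default_factory=list)
    sdus: List[bytes] = field(default_factory=list)
    padding: bytes = b""

    def size(self) -> int:
        """Total size in bytes of all parts of the PDU."""
        return (len(self.header)
                + sum(len(ce) for ce in self.control_elements)
                + sum(len(sdu) for sdu in self.sdus)
                + len(self.padding))

# A hypothetical PDU with one control element and one SDU.
pdu = MacPdu(header=b"\x3d",
             control_elements=[b"\x01\x02"],
             sdus=[b"x" * 10])
```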
Since the formulation of the LTE Release 8 standards is close to completion, in order to adapt to the current and future rapid growth in demand for various radio services, the next evolution of the LTE Release 8 standard, the LTE-Advanced standard, has already entered the formulation process.
The LTE-Advanced system uses carrier aggregation technology, and the carriers participating in the aggregation are called component carriers. The UE can simultaneously transmit to and receive from the eNB in multiple carrier frequency bands, while within a single carrier frequency band the LTE Release 8 features still remain; that is, the LTE-Advanced system can be seen as the "binding" of multiple LTE systems. After the carrier aggregation technology is introduced, the available resources in the LTE-Advanced system are greatly expanded, and the flexibility of uplink scheduling is greatly improved. In the LTE system, the eNB can only allocate contiguous frequency spectrum resources to the UE in the frequency domain, so the scheduling flexibility is considerably limited; after multiple component carriers are aggregated, the eNB can allocate resources to the UE in each component carrier frequency band, which is equivalent to introducing distributed scheduling in the frequency domain, so both the scheduling flexibility and the frequency domain diversity gain improve significantly. For example, the eNB might schedule different service data in different component carrier frequency bands, whose channel conditions differ (different frequencies lead to different channel conditions such as fast fading, slow fading, Doppler shift and so on); the eNB can then allocate the component carrier resource whose channel quality is just right to the corresponding service according to the QoS requirements of the different types of services. This avoids the situation in which a service with low channel quality requirements occupies frequency resources of high channel quality while the data of a service with high channel quality requirements cannot obtain the best frequency resources, in which case the communication quality of that service would be affected and the system throughput decreased.
Although the eNB can flexibly schedule the resources after the carrier aggregation technology is introduced, at present the only information directly related to resource allocation in the resource assignment control signaling sent by the eNB to the UE is the resource block assignment information and the MCS, from which the UE cannot determine which component carrier's resources the eNB intends to allocate to which logical channel. According to the UE handling mechanism defined in the current LTE Release 8 standard, the UE simply allocates the resources of the first component carrier assigned by the eNB to the logical channel with the highest priority, and only begins allocating the resources of the second component carrier once the resources of the first component carrier have been fully allocated. This means that the scheduling intention of the eNB is not executed by the UE at all, which will decrease the system throughput.
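The Release 8 UE behaviour just described, applied naively to aggregated carriers, can be sketched as follows: the UE fills carrier after carrier with the highest-priority logical channel data, regardless of any per-carrier intention the eNB may have had. The function and channel names are hypothetical illustrations, not standard-defined identifiers.

```python
def fill_carriers(carrier_capacities, lch_buffers_by_priority):
    """Sketch of strict-priority, carrier-by-carrier filling.
    carrier_capacities: bytes available on each component carrier,
      in the order they were assigned by the eNB.
    lch_buffers_by_priority: (channel_name, buffered_bytes) pairs,
      highest priority first.
    Returns {carrier_index: [(channel_name, bytes_placed), ...]}.
    """
    placement = {i: [] for i in range(len(carrier_capacities))}
    buffers = [[name, size] for name, size in lch_buffers_by_priority]
    for i, cap in enumerate(carrier_capacities):
        for entry in buffers:
            if cap == 0:
                break
            take = min(cap, entry[1])
            if take:
                placement[i].append((entry[0], take))
                entry[1] -= take
                cap -= take
    return placement

# Hypothetical example: two component carriers, two logical channels.
result = fill_carriers([100, 200], [("voip", 50), ("ftp", 400)])
```

In the example, the first carrier is exhausted by VoIP data plus the start of the FTP data before the second carrier is touched; even if the eNB had chosen the second carrier specifically for VoIP because of its channel quality, that intention is ignored.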
In the LTE system, since the uplink resources are limited and the uplink scheduling is not very flexible, the performance of the method in which the UE side uses a simple logical channel priority handling mechanism to consume the resources assigned by the eNB is acceptable. However, in the LTE-Advanced system, since the carrier aggregation technology provides great uplink scheduling flexibility and greatly expands the overall system bandwidth, the performance loss caused by the aforementioned shortcomings becomes very large.