Mobile communication systems were originally developed to provide voice services while ensuring user mobility. Since then, mobile communication systems have extended their scope to data services as well as voice, and today the explosive growth in traffic is causing a shortage of resources while users demand ever faster services, which is creating a need for advanced mobile communication systems.
The requirements for next-generation mobile communication systems largely include handling explosive data traffic, supporting very high data rates per user, accommodating a very large number of connected devices, achieving very low end-to-end latency, and supporting high energy efficiency. To this end, research is ongoing on a variety of technologies such as dual connectivity, massive multiple-input multiple-output (massive MIMO), in-band full duplex, non-orthogonal multiple access (NOMA), super-wideband support, and device networking.
In particular, the key performance measures for 5G mobile communication include user-perceived data rate, peak data rate, transmission latency, device density, energy efficiency, spectral efficiency, system capacity per unit area, and so on.
Among them, one of the most prominent features of 5G mobile communication systems that distinguish them from the existing 4G mobile communication systems is the low latency requirement.
This requirement exists to support services such as smart grid, vehicle-to-vehicle communication, and virtual reality, which are envisioned for 5G mobile communication systems and generally require very low latency. For this reason, it is difficult to support these services with the transmission latency of existing mobile communication systems.
While most research on the existing 4G mobile communication systems has focused on improving peak data rates at the UE, much of the recent research on 5G mobile communication systems focuses on meeting the low transmission latency requirement.
In the existing LTE/LTE-A communication technology, when a UE in the RRC_Idle state needs to perform an uplink transmission, it must first perform an RRC connection procedure to transition to the RRC_Connected state.
However, the RRC connection procedure requires ten or more exchanges of control messages between a UE and a base station, which results in a minimum latency of about 50 ms.
Thus, the RRC connection procedure is a major bottleneck in meeting the low latency requirement for the 5G mobile communication technology.
In particular, services such as smart grid and vehicle-to-vehicle communication, which stand to benefit from the low latency offered by 5G mobile communication technology, require low transmission latency, yet the UE is highly likely to be in the RRC_Idle state when an uplink transmission arises. The RRC connection procedure is therefore a major obstacle to achieving the low latency these services require.
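The latency cost of the connection procedure described above can be illustrated with a minimal sketch. The message sequence and the per-message delay below are illustrative assumptions, not values taken from any specification; the sketch only shows how a ten-message exchange accumulates to a latency on the order of 50 ms.

```python
# Hypothetical sketch: estimating RRC connection setup latency from the
# number of control-message exchanges. The message list and per-message
# delay are illustrative assumptions, not specification values.

PER_MESSAGE_DELAY_MS = 5  # assumed per-message transfer + processing delay

# Simplified message sequence of an RRC connection procedure (sketch only)
rrc_messages = [
    "RACH Preamble",
    "Random Access Response",
    "RRCConnectionRequest",
    "RRCConnectionSetup",
    "RRCConnectionSetupComplete",
    "SecurityModeCommand",
    "SecurityModeComplete",
    "RRCConnectionReconfiguration",
    "RRCConnectionReconfigurationComplete",
    "First uplink data transmission",
]

def setup_latency_ms(messages, per_msg_delay_ms):
    """Lower-bound latency: each message costs at least one transfer delay."""
    return len(messages) * per_msg_delay_ms

latency = setup_latency_ms(rrc_messages, PER_MESSAGE_DELAY_MS)
print(f"{len(rrc_messages)} messages -> >= {latency} ms before uplink data")
```

With ten messages at an assumed 5 ms each, the lower bound is 50 ms, consistent with the latency figure cited above.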
The RRC protocol is used in LTE/LTE-A mobile communication technology between the UE and the base station to exchange the basic control information needed for the UE to access the mobile communication system.
The UE exchanges the necessary information with the base station and establishes an RRC connection in order to communicate, in which case the UE is defined as being in the RRC_Connected state.
However, if there is no communication between the UE and the base station for a certain period of time, the connection is released to manage the UE's power consumption. The UE then transitions to the RRC_Idle state and remains there until an uplink or downlink transmission occurs at the UE.
In the existing LTE/LTE-A communication technology, a UE in the RRC_Idle state must re-establish an RRC connection in order to send data, and establishing the connection and sending the data takes about 50 ms.
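The state behavior described above can be sketched as a small state machine: the UE enters RRC_Connected when a connection is established and falls back to RRC_Idle after sustained inactivity. The inactivity timeout value is an assumption for illustration only.

```python
# Hypothetical sketch of the RRC state transitions described above. The
# inactivity timeout is an assumed value, not taken from any specification.

INACTIVITY_TIMEOUT_MS = 10_000  # assumed inactivity timer

class UE:
    def __init__(self):
        self.state = "RRC_Idle"
        self.idle_for_ms = 0

    def establish_connection(self):
        """RRC connection procedure completed: UE becomes connected."""
        self.state = "RRC_Connected"
        self.idle_for_ms = 0

    def tick(self, elapsed_ms, traffic=False):
        """Advance time; release the connection after sustained inactivity."""
        if self.state != "RRC_Connected":
            return
        self.idle_for_ms = 0 if traffic else self.idle_for_ms + elapsed_ms
        if self.idle_for_ms >= INACTIVITY_TIMEOUT_MS:
            self.state = "RRC_Idle"

ue = UE()
ue.establish_connection()
ue.tick(5_000)      # some silence, still within the timer
print(ue.state)     # RRC_Connected
ue.tick(5_000)      # timer expires -> connection released
print(ue.state)     # RRC_Idle
```

Any uplink data arriving after the release would require the full connection procedure again, which is the source of the ~50 ms cost noted above.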
In services such as vehicle-to-vehicle communication, smart grid communication, and interrupt messages from IoT devices, it is not possible to predict when packets will be generated, and low latency is required, which makes it difficult to adopt the existing communication technology.
Moreover, the RRC connection procedure requires at least nine RRC message exchanges between the UE and the base station. Performing the RRC connection procedure every time small data such as sensor information is sent therefore leads to inefficient use of resources in terms of signaling overhead.
Therefore, in order to meet the aforementioned service requirements and use resources efficiently, the overhead and latency of transmission should be reduced by simplifying the RRC connection procedure for uplink transmission.
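The scale of the signaling overhead described above can be sketched with illustrative numbers. The per-message size and payload size below are assumptions chosen only to show the order of magnitude, not measured or specified values.

```python
# Hypothetical sketch: signaling overhead when a full RRC connection
# procedure precedes each small uplink report. Message and payload sizes
# are illustrative assumptions, not specification values.

RRC_MESSAGES = 9            # minimum message exchanges (from the text)
AVG_CONTROL_MSG_BYTES = 60  # assumed average size of one control message
SENSOR_PAYLOAD_BYTES = 20   # assumed small sensor report

signaling_bytes = RRC_MESSAGES * AVG_CONTROL_MSG_BYTES
overhead_ratio = signaling_bytes / SENSOR_PAYLOAD_BYTES

print(f"{signaling_bytes} signaling bytes to deliver "
      f"{SENSOR_PAYLOAD_BYTES} payload bytes "
      f"(overhead ratio {overhead_ratio:.0f}x)")
```

Under these assumptions, well over an order of magnitude more bytes are spent on signaling than on the data itself, which is why simplifying the procedure matters for small-data services.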
Additionally, in the existing LTE/LTE-A communication technology, the UE must receive the downlink control information (DCI) that the base station sends for downlink transmission.
The DCI is sent from the base station only when the UE is scheduled for downlink, and it indicates the positions of the resources the UE has to receive in the corresponding subframe.
Thus, latency can be minimized if the UE is able to continuously receive downlink control information in every subframe.
However, the UE's continuous reception of control information in every subframe leads to the problem of very high power consumption.
Accordingly, the existing LTE/LTE-A communication technology employs discontinuous reception (DRX), in which the UE receives downlink control information at regular intervals rather than continuously, saving the reception power the UE would otherwise spend on it.
However, the drawback of DRX is that, when a downlink packet arrives for a UE, the base station must schedule it according to that UE's next downlink reception cycle, which increases latency.
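The added latency can be sketched with a simple model: a packet arriving at the base station must wait until the start of the UE's next reception opportunity. The cycle length below is an illustrative assumption, not a value from any specification.

```python
# Hypothetical sketch: extra downlink latency introduced by DRX. A packet
# arriving between reception opportunities waits until the next one. The
# cycle length is an illustrative assumption.

DRX_CYCLE_MS = 320  # assumed DRX cycle length

def drx_wait_ms(arrival_ms, cycle_ms=DRX_CYCLE_MS):
    """Time from packet arrival until the next reception opportunity."""
    return (-arrival_ms) % cycle_ms

# A packet arriving just after an opportunity waits nearly a full cycle:
print(drx_wait_ms(1))    # 319 ms
# A packet arriving exactly at an opportunity waits 0 ms:
print(drx_wait_ms(320))  # 0 ms
```

In the worst case the waiting time approaches one full cycle, so a longer DRX cycle saves more power but adds proportionally more downlink latency.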
Therefore, in order to meet the low latency requirement of 5G mobile communication technology, a technology is required that enables continuous reception of downlink control information while reducing the power the UE consumes to receive it.
In the existing LTE/LTE-A communication technology, each UE performs blind decoding on the downlink control channel to check for the presence or absence of downlink control information addressed to it, in order to determine whether it is scheduled for downlink.
If there is no information that is successfully decoded, the UE determines that it is not scheduled for the corresponding subframe. If there is information that is successfully decoded, the UE receives downlink data from the base station by using the successfully decoded information.
Downlink control information indicates the positions of the resources carrying the downlink data the UE has to receive in the subframe in which that control information was transmitted, as well as the transmission method. Thus, upon receiving downlink control information from the base station, the UE always needs to receive the entire corresponding subframe and store it in a buffer for downlink data reception.
This leads to high power consumption when the UE receives downlink control information, making continuous reception of downlink control information impractical.
Furthermore, the data sent in a low latency service is generally expected to be small, so in the existing communication technology the signaling overhead due to the size of the DCI becomes excessive compared to the size of the data actually sent.
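The mismatch can be sketched with illustrative bit counts. The DCI size, CRC size, and payload size below are assumptions chosen only to show the proportion, not values from any specification.

```python
# Hypothetical sketch: control overhead when the DCI plus its CRC is large
# relative to a small low-latency payload. All bit counts are illustrative
# assumptions, not specification values.

DCI_BITS = 40        # assumed DCI payload bits
DCI_CRC_BITS = 16    # assumed CRC bits attached to the DCI
DATA_BITS = 80       # assumed small low-latency data packet (10 bytes)

control_bits = DCI_BITS + DCI_CRC_BITS
overhead = control_bits / DATA_BITS

print(f"{control_bits} control bits per {DATA_BITS} data bits "
      f"({overhead:.0%} overhead)")
```

Under these assumptions the control information alone amounts to a substantial fraction of the data it schedules, whereas for a large data transfer the same DCI would be negligible.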