Radio link monitoring (RLM), which is used by a user equipment (UE) to assess serving cell performance, is defined assuming a dual UE receiver implementation. The corresponding predefined and configurable parameters associated with RLM are used regardless of the UE receiver implementation and capability. However, UEs with a single receiver or with more than two receivers are expected to become available. Applying the existing RLM procedures and parameter settings to these new UE receiver capabilities may lead to incorrect assessment and detection of radio link problems.
A UE performs measurements on a serving cell (e.g., a primary cell) in order to monitor the serving cell performance. In LTE, this is called radio link monitoring (RLM) or RLM related measurements. For RLM, the UE monitors the downlink link quality based on the cell-specific reference signal in order to assess the downlink radio link quality of the serving cell or PCell.
In order to detect an out of sync (OOS) status and an in sync (IS) status, the UE compares the estimated DL signal quality of the serving cell with the thresholds Qout and Qin, respectively. The thresholds Qout and Qin are defined as the levels at which the downlink radio link cannot be reliably received, and correspond to block error rates of 10% and 2%, respectively, of hypothetical PDCCH transmissions. In non-DRX operation, the downlink link quality for out of sync and in sync is estimated over evaluation periods of 200 ms and 100 ms, respectively. In DRX operation, the downlink link quality for out of sync and in sync is estimated over the same evaluation period, which scales with the DRX cycle. In addition to filtering on the physical layer (i.e., the evaluation period), the UE also applies higher layer filtering based on network configured parameters. This increases the reliability of radio link failure (RLF) detection and thus avoids unnecessary radio link failure and, consequently, RRC re-establishment. The UE declares RLF after the detection of a certain number of consecutive OOS indications and the expiration of an RLF timer.
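The counting and timer behavior described above can be sketched as follows. This is an illustrative model only, not the normative procedure: the names N310, N311 and T310 follow the LTE RRC counters and timer governing RLF declaration, but the parameter values used here are arbitrary examples, and the per-indication BLER input stands in for the physical-layer quality estimate produced over the evaluation period.

```python
# Illustrative sketch of RLF detection from OOS/IS indications.
# N310/N311/T310 names follow LTE RRC; values here are examples only.

QOUT_BLER = 0.10  # hypothetical PDCCH BLER above which an OOS indication occurs
QIN_BLER = 0.02   # hypothetical PDCCH BLER below which an IS indication occurs

class RlfMonitor:
    def __init__(self, n310=10, n311=1, t310_ms=1000):
        self.n310, self.n311, self.t310_ms = n310, n311, t310_ms
        self.oos_count = 0          # consecutive OOS indications
        self.is_count = 0           # consecutive IS indications while T310 runs
        self.t310_deadline = None   # ms timestamp at which T310 expires
        self.rlf = False

    def on_indication(self, bler, now_ms):
        """Process one L1 quality estimate (hypothetical PDCCH BLER)."""
        if self.rlf:
            return
        if bler > QOUT_BLER:        # out-of-sync indication
            self.oos_count += 1
            self.is_count = 0
            if self.oos_count >= self.n310 and self.t310_deadline is None:
                self.t310_deadline = now_ms + self.t310_ms  # start T310
        elif bler < QIN_BLER:       # in-sync indication
            self.is_count += 1
            self.oos_count = 0
            if self.t310_deadline is not None and self.is_count >= self.n311:
                self.t310_deadline = None  # link recovered: stop T310
        if self.t310_deadline is not None and now_ms >= self.t310_deadline:
            self.rlf = True         # T310 expired: declare radio link failure
```

In this sketch, consecutive OOS indications first arm the timer, consecutive IS indications disarm it, and RLF is declared only if the timer runs to expiry, which mirrors the two-stage (counter plus timer) filtering that avoids unnecessary RRC re-establishment.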
A UE is required to meet the RLM requirements (i.e., the OOS and IS quality targets) provided the transmission parameters of PDCCH/PCFICH for OOS and IS detection are according to table 1 (FIG. 1) and table 2 (FIG. 2), respectively.
In LTE, the baseline UE receiver (interchangeably referred to as a radio receiver, radio chain or IFFT/FFT) is a dual receiver (e.g., 2-way receiver diversity). The UE requirements, including RLM, are defined assuming a dual receiver. In later releases (release 11 and onwards), more complex UE receivers (e.g., enhanced receivers) that can mitigate inter-cell interference were also introduced, but they still rely on the dual receiver baseline architecture. The terms interference mitigation (IM) receiver, interference cancellation (IC) receiver, interference suppression receiver, interference rejection receiver, interference aware receiver, interference avoidance receiver, etc., are used interchangeably, but they all belong to the category of advanced or enhanced receivers. Inter-cell interference mitigation refers to the receiver's ability to mitigate the interference caused by at least certain signals received at the UE receiver from at least one interfering cell (e.g., an aggressor cell). In release 12 or later, UE requirements for a single receiver and also for more than two receivers (e.g., 4-way receiver diversity) could be introduced. Examples of well-known receiver types are IC/IM, MMSE, MMSE-IRC, maximum likelihood (ML), successive interference cancellation (SIC) and parallel interference cancellation (PIC) receivers, or any combination thereof. Examples of receiver types in terms of their ability to mitigate specific types of interfering signals are CRS-IM, PSS/SSS IC, PBCH IC, PDCCH IC and PDSCH IC receivers.
Machine-to-machine (M2M) communication (e.g., machine type communication (MTC)) is used for establishing communication between machines, and between machines and humans. This communication may comprise exchange of data, signaling, measurement data, configuration information, etc. The device size may vary from that of a wallet to that of a base station. M2M devices are quite often used for applications such as sensing environmental conditions (e.g., temperature reading), metering or measurement (e.g., electricity usage), and fault finding or error detection. In these applications, the M2M devices are seldom active; when active, they operate over a consecutive duration that depends upon the type of service, e.g., about 200 ms once every 2 seconds, or about 500 ms every 60 minutes. The M2M device may also perform measurements on other frequencies or other RATs.
One category of M2M devices is referred to as a low cost device category. For example, the cost reduction can be realized by having just a single receiver in the UE. The cost can be further reduced by combining the single receiver with half duplex FDD (HD-FDD) capability. The latter feature obviates the need for a duplex filter, since the UE does not transmit and receive at the same time. A low cost UE may also implement additional low cost features such as a downlink and uplink maximum transport block size (TBS) of 1000 bits and a reduced downlink channel bandwidth of 1.4 MHz for a data channel (e.g., PDSCH). For example, a low cost UE may comprise a single receiver and one or more of the following additional features: HD-FDD, the downlink and/or uplink maximum TBS of 1000 bits, and the reduced downlink channel bandwidth of 1.4 MHz for the data channel.
Another category of M2M devices is required to support enhanced UL and/or DL coverage. These devices are installed at locations where the path loss between the M2M device and the base station can be very large, such as when the device is used as a sensor or metering device in a remote location such as the basement of a building. In such scenarios, the reception of a signal from a base station is very challenging. For example, the path loss can be 15-20 dB worse than in normal operation. In order to cope with such challenges, the coverage in the uplink direction and/or the downlink direction has to be substantially enhanced. This can be realized by employing one or a plurality of advanced techniques in the UE and/or in a radio network node for enhancing the coverage, e.g., boosting of DL transmit power, boosting of UL transmit power, an enhanced UE receiver, signal repetition, etc.
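To illustrate the scale of the signal repetition technique mentioned above: under the idealized assumption of coherent combining, N repetitions of a transmission yield roughly a 10·log10(N) dB SNR gain. The sketch below computes how many repetitions would be needed to close a given coverage gap under that assumption; in practice the achievable combining gain is lower due to channel estimation losses and imperfect coherence, so these figures are a lower bound, not an implementation.

```python
import math

def repetitions_for_gain(gain_db):
    """Repetitions N such that ideal coherent combining of N copies
    yields at least gain_db of SNR improvement (gain = 10*log10(N) dB).
    Idealized sketch: real combining gain is reduced by channel
    estimation errors and imperfect coherence across repetitions."""
    return math.ceil(10 ** (gain_db / 10))

# Closing the 15-20 dB coverage gap cited above:
repetitions_for_gain(15)  # -> 32 repetitions
repetitions_for_gain(20)  # -> 100 repetitions
```

This back-of-the-envelope calculation shows why coverage-enhanced M2M operation implies transmissions stretched over many subframes, which in turn motivates revisiting RLM evaluation periods for such devices.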
The UE and a serving network node of the UE need to comply with the radio link monitoring (RLM) requirements, which are predefined in the standard. These requirements are derived assuming that the UE has a dual receiver (aka 2-way receiver diversity). The RLM performance heavily affects the cell coverage. The network planning (e.g., cell size, distance between base station sites, etc.) also relies on the RLM performance, in addition to other factors (e.g., base station power class). One type of low cost M2M device comprises a UE with a single receiver. The current RLM procedures and associated parameters, defined for a dual UE receiver implementation, may degrade the assessment of the serving cell performance. For example, the UE may lose its serving cell coverage but may not find another stronger cell, thereby getting lost in a coverage hole or dead cell zone. Similarly, the current RLM settings may not be appropriate for a high end UE having more than two receivers, as these settings may cause the UE to overestimate the serving cell performance. The problem is further accentuated when such a high end UE autonomously changes, or is configured to adapt, the number of receivers actually used for receiving signals. In this case, the RLM behavior of the UE is unclear and the ambiguity may lead to serving cell performance degradation.