Long Term Evolution (LTE for short) is a long-term evolution of the Universal Mobile Telecommunications System (UMTS) technology standard, specified by the 3rd Generation Partnership Project (3GPP for short) organization. With key technologies such as orthogonal frequency division multiplexing (OFDM for short) and multiple-input multiple-output (MIMO for short) introduced, LTE significantly increases spectral efficiency and data transmission rates, and has therefore been widely deployed in recent years.
An LTE system is based on orthogonal frequency division multiple access (OFDMA for short) in the downlink and single carrier frequency division multiple access (SC-FDMA for short) in the uplink. A time-frequency resource is divided into OFDM or SC-FDMA symbols (hereinafter referred to as time-domain symbols) in the time domain dimension and subcarriers in the frequency domain dimension. The smallest resource granularity is referred to as a resource element (RE for short), which represents a time-frequency grid formed by one time-domain symbol in the time domain and one subcarrier in the frequency domain. In the LTE system, service transmission is based on scheduling by a base station (eNB for short); the basic time unit of scheduling is one subframe, and one subframe includes multiple time-domain symbols. A specific scheduling process is as follows: The base station sends a control channel, such as a physical downlink control channel (PDCCH for short) or an enhanced physical downlink control channel (EPDCCH for short). The control channel may carry scheduling information of a physical downlink shared channel (PDSCH for short) or a physical uplink shared channel (PUSCH for short), and the scheduling information includes control information such as resource allocation information or a modulation and coding scheme. User equipment (UE for short) detects the control channel in a subframe, and receives a downlink data channel or sends an uplink data channel according to the scheduling information carried by the detected control channel.
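The resource grid described above can be sketched numerically. The sketch below assumes a normal cyclic prefix (14 time-domain symbols per 1 ms subframe) and 12 subcarriers per resource block, the values given in 3GPP TS 36.211; the `resource_elements` helper is illustrative, not a function from any standard or library.

```python
# Sketch of the LTE time-frequency grid, assuming a normal cyclic prefix
# (2 slots x 7 symbols = 14 time-domain symbols per 1 ms subframe) and
# 12 subcarriers per resource block, per 3GPP TS 36.211.

SYMBOLS_PER_SUBFRAME = 14   # OFDM/SC-FDMA symbols in one subframe
SUBCARRIERS_PER_RB = 12     # one resource block in the frequency domain

def resource_elements(num_rbs: int) -> int:
    """Number of resource elements (REs) in one subframe across num_rbs
    resource blocks: one RE = one time-domain symbol x one subcarrier."""
    return SYMBOLS_PER_SUBFRAME * SUBCARRIERS_PER_RB * num_rbs

# A 20 MHz LTE carrier has 100 resource blocks:
print(resource_elements(100))  # 16800 REs per subframe
```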
LTE supports two duplex modes: frequency division duplex (FDD for short) and time division duplex (TDD for short). For FDD, downlink and uplink transmission is performed on different carriers. For a TDD system, uplink and downlink transmission is performed at different times on the same carrier. Specifically, one carrier includes downlink subframes, uplink subframes, and special subframes. LTE currently supports seven different TDD uplink-downlink configurations.
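The seven TDD uplink-downlink configurations can be written out as subframe patterns. The table below follows 3GPP TS 36.211 (Table 4.2-2); the `is_downlink` helper is an illustrative convenience, not part of the specification.

```python
# The seven LTE TDD uplink-downlink configurations (3GPP TS 36.211,
# Table 4.2-2). Each string lists the ten subframes of one radio frame:
# D = downlink subframe, U = uplink subframe, S = special subframe.
TDD_UL_DL_CONFIGS = {
    0: "DSUUUDSUUU",
    1: "DSUUDDSUUD",
    2: "DSUDDDSUDD",
    3: "DSUUUDDDDD",
    4: "DSUUDDDDDD",
    5: "DSUDDDDDDD",
    6: "DSUUUDSUUD",
}

def is_downlink(config: int, subframe: int) -> bool:
    """True if the given subframe (0..9) is a downlink subframe."""
    return TDD_UL_DL_CONFIGS[config][subframe] == "D"

print(is_downlink(2, 3))  # True: subframe 3 is downlink in configuration 2
```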
In LTE, a hybrid automatic repeat request (HARQ for short) mechanism is used to implement error detection and correction. For example, in the downlink, after the UE receives a PDSCH, if the PDSCH is correctly received, the UE feeds back an acknowledgment (ACK for short) on a physical uplink control channel (PUCCH for short); or if the PDSCH is incorrectly received, the UE feeds back a negative acknowledgment (NACK for short) on the PUCCH. LTE further supports a carrier aggregation (CA for short) technology, that is, the base station configures multiple carriers for one UE to improve the data rate of the UE. During CA, the multiple carriers sent by the base station are transmitted synchronously in time. The UE separately detects, for each carrier, a PDCCH that schedules the carrier and the corresponding PDSCH; the detection process for each carrier is similar to that for the foregoing single carrier. The LTE system supports FDD CA, TDD CA, and FDD+TDD CA. TDD CA is further classified into TDD CA with a same uplink-downlink configuration and TDD CA with different uplink-downlink configurations. In a CA mode, there is one primary component carrier and at least one secondary component carrier, and a PUCCH that carries an ACK/NACK is sent only on the primary component carrier of the UE. When HARQ-ACKs of multiple downlink carriers are transmitted on one PUCCH or PUSCH, joint coding is generally used. In the LTE system, uplink control signaling mainly has two coding manners: one is a linear block code such as a Reed-Muller (RM for short) code, and the other is a convolutional code. Regardless of which coding manner is used, the base station generally needs to know, before decoding, the total quantity of original information bits that the user equipment jointly coded.
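The dependence of decoding on the original information bit quantity can be illustrated with a toy block code. The sketch below is a deliberately simplified stand-in for the RM or convolutional coding actually defined for LTE: `toy_encode` and `toy_decode` are hypothetical helpers, and the point is only that the decoder cannot invert the mapping without knowing the payload size O.

```python
# Minimal sketch, under simplified assumptions, of why the decoder must
# know the total original information bit quantity O before decoding
# jointly coded uplink control bits. A toy block code maps O bits into a
# fixed-length codeword by cyclic repetition; this is an illustration,
# NOT the RM or convolutional coding defined for LTE.

CODEWORD_LEN = 20  # fixed number of coded bits carried on the channel

def toy_encode(info_bits):
    # cyclically repeat the O information bits to fill the codeword
    o = len(info_bits)
    return [info_bits[i % o] for i in range(CODEWORD_LEN)]

def toy_decode(coded_bits, o):
    # recover the first O bits; only valid if O matches the encoder's O
    return coded_bits[:o]

ack_nack = [1, 0, 1]            # 3 jointly coded HARQ-ACK bits
coded = toy_encode(ack_nack)
print(toy_decode(coded, 3) == ack_nack)  # True when O is known correctly
```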
Generally, the UE calculates the total original information bit quantity for HARQ-ACK joint coding according to the quantity of PDSCHs it detects on the downlink carriers. In this case, once the UE misses a PDSCH on a downlink carrier during detection, the quantity of carriers that the UE understands to carry PDSCHs is less than the quantity of carriers on which the eNB actually sends PDSCHs. The UE feeds back HARQ-ACKs only for the PDSCHs it has detected, and as a result the base station cannot correctly decode the HARQ-ACK feedback from the UE.