The 3GPP initiative “License Assisted Access” (LAA) intends to allow LTE (Long Term Evolution) equipment to also operate in the unlicensed radio spectrum such as the 5 GHz band. The unlicensed spectrum is used as a complement to the licensed spectrum. Accordingly, User Equipment (UE) devices connect in the licensed spectrum (primary cell or PCell) and use carrier aggregation to benefit from additional transmission capacity in the unlicensed spectrum (secondary cell or SCell). To reduce the changes required for aggregating licensed and unlicensed spectrum, the LTE frame timing in the primary cell is simultaneously used in the secondary cell.
In addition to LAA operation, it should be possible to run LTE fully in the unlicensed band without support from the licensed band. This is called LTE-U Stand Alone. Another variant of LTE-U Stand Alone is further standardized by the MulteFire Alliance.
Regulatory requirements, however, may not permit transmissions in the unlicensed spectrum without prior channel sensing. Since the unlicensed spectrum must be shared with other radio devices of similar or dissimilar wireless technologies, a so-called listen-before-talk (LBT) method needs to be applied. Today, the unlicensed 5 GHz spectrum is mainly used by equipment implementing the IEEE 802.11 Wireless Local Area Network (WLAN) standard. This standard is known under its marketing brand "Wi-Fi."
The LBT procedure leads to uncertainty at the base station or node (eNB) regarding whether it will be able to transmit (a) DownLink (DL) subframe(s) or not. This leads to a corresponding uncertainty at the UE as to whether it actually has a subframe to decode or not. An analogous uncertainty exists in the UpLink (UL) direction where the eNB is uncertain if the UEs scheduled on the SCell actually made a transmission or not.
LTE uses OFDM (Orthogonal Frequency Division Multiplexing) in the downlink and Discrete Fourier Transform (DFT)-spread OFDM (also referred to as single-carrier frequency division multiple access, SC-FDMA) in the uplink. The basic LTE downlink physical resource can thus be seen as a time-frequency grid as illustrated in FIG. 1, where each resource element 110 corresponds to one OFDM subcarrier during one OFDM symbol interval. The uplink subframe has the same subcarrier spacing as the downlink and the same number of SC-FDMA symbols in the time domain as OFDM symbols in the downlink. Each OFDM symbol 110 comprises a cyclic prefix 120.
In the time domain, LTE downlink transmissions are organized into radio frames of 10 ms, each radio frame consisting of ten equally-sized subframes of length Tsubframe=1 ms as shown in FIG. 2. For normal cyclic prefix, one subframe consists of 14 OFDM symbols. The duration of each symbol is approximately 71.4 μs.
Furthermore, the resource allocation in LTE is typically described in terms of resource blocks, where a resource block corresponds to one slot (0.5 ms) in the time domain and 12 contiguous subcarriers in the frequency domain. A pair of two adjacent resource blocks in time direction (1.0 ms) is known as a resource block pair. Resource blocks are numbered in the frequency domain, starting with 0 from one end of the system bandwidth.
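The numerology described above (10 ms frames, 1 ms subframes, 14 symbols per subframe with normal cyclic prefix, resource blocks of one slot by 12 subcarriers) can be illustrated with a short arithmetic sketch. The constant names are illustrative, not from any LTE library:

```python
# Basic LTE downlink dimensioning for the normal cyclic prefix case
# described above. All identifiers are illustrative.

FRAME_MS = 10.0             # radio frame duration
SUBFRAME_MS = 1.0           # Tsubframe
SYMBOLS_PER_SUBFRAME = 14   # normal cyclic prefix
SUBCARRIERS_PER_RB = 12     # resource block width in frequency

subframes_per_frame = int(FRAME_MS / SUBFRAME_MS)               # 10
symbol_duration_us = SUBFRAME_MS * 1000 / SYMBOLS_PER_SUBFRAME  # ~71.4 us

# A resource block spans one slot (7 symbols for normal CP) and
# 12 contiguous subcarriers, i.e. 84 resource elements.
resource_elements_per_rb = (SYMBOLS_PER_SUBFRAME // 2) * SUBCARRIERS_PER_RB

print(subframes_per_frame)           # 10
print(round(symbol_duration_us, 1))  # 71.4
print(resource_elements_per_rb)      # 84
```

A resource block pair then simply doubles the time extent to a full 1.0 ms subframe.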
Downlink transmissions are dynamically scheduled, i.e., in each subframe the base station transmits control information about which terminal(s) data is transmitted to and upon which resource blocks the data is transmitted in the current downlink subframe. This control signaling is typically transmitted in the first 1, 2, 3 or 4 OFDM symbols in each subframe, and the number n=1, 2, 3 or 4 is known as the Control Format Indicator (CFI). The downlink subframe also contains common reference symbols, which are known to the receiver and used for coherent demodulation of e.g. the control information. A downlink system with CFI=3 OFDM symbols as control is illustrated in FIG. 3.
From LTE Rel-11 onwards, the above-described resource assignments can also be scheduled on the enhanced Physical Downlink Control Channel (ePDCCH). For Rel-8 to Rel-10, only the Physical Downlink Control Channel (PDCCH) is available.
The reference symbols shown in the above FIG. 3 are the cell specific reference symbols (CRS) and are used to support multiple functions including fine time and frequency synchronization and channel estimation for certain transmission modes.
The PDCCH/ePDCCH is used to carry downlink control information (DCI) such as scheduling decisions and power-control commands. More specifically, the DCI includes:
- Downlink scheduling assignments, including PDSCH resource indication, transport format, hybrid-ARQ information, and control information related to spatial multiplexing (if applicable). A downlink scheduling assignment also includes a command for power control of the PUCCH used for transmission of hybrid-ARQ acknowledgements in response to downlink scheduling assignments.
- Uplink scheduling grants, including PUSCH resource indication, transport format, and hybrid-ARQ-related information. An uplink scheduling grant also includes a command for power control of the PUSCH.
- Power-control commands for a set of terminals as a complement to the commands included in the scheduling assignments/grants.
One PDCCH/ePDCCH carries one DCI message containing one of the groups of information listed above. As multiple terminals can be scheduled simultaneously, and each terminal can be scheduled on both downlink and uplink simultaneously, there must be a possibility to transmit multiple scheduling messages within each subframe. Each scheduling message is transmitted on separate PDCCH/ePDCCH resources, and consequently there are typically multiple simultaneous PDCCH/ePDCCH transmissions within each subframe in each cell. Furthermore, to support different radio-channel conditions, link adaptation can be used, where the code rate of the PDCCH/ePDCCH is selected by adapting the resource usage for the PDCCH/ePDCCH, to match the radio-channel conditions.
Here follows a discussion on the start symbol for PDSCH and ePDCCH within the subframe. The OFDM symbols in the first slot are numbered from 0 to 6. For transmission modes 1-9, the starting OFDM symbol in the first slot of the subframe for ePDCCH can be configured by higher-layer signaling, and the same value is used for the corresponding scheduled PDSCH. Both ePDCCH sets have the same starting symbol for these transmission modes. If not configured by higher layers, the start symbol for both PDSCH and ePDCCH is given by the CFI value signaled on the PCFICH.
Multiple OFDM starting symbol candidates can be achieved by configuring the UE in transmission mode 10, by having multiple ePDCCH PRB configuration sets where for each set the starting OFDM symbol in the first slot in a subframe for ePDCCH can be configured by higher layers to be a value from {1,2,3,4}, independently for each ePDCCH set. If a set is not higher layer configured to have a fixed start symbol, then the ePDCCH start symbol for this set follows the CFI value received in PCFICH.
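The start-symbol resolution rule described above (a higher-layer configured value from {1,2,3,4} per ePDCCH set, falling back to the CFI received on PCFICH) can be sketched as follows; the function and parameter names are hypothetical:

```python
def epdcch_start_symbol(configured_start, cfi):
    """Resolve the starting OFDM symbol for one ePDCCH PRB set.

    configured_start: higher-layer configured value in {1, 2, 3, 4},
    or None if not configured for this set.
    cfi: the CFI value received on PCFICH.
    Hypothetical helper illustrating the rule described above.
    """
    if configured_start is not None:
        if configured_start not in (1, 2, 3, 4):
            raise ValueError("configured start symbol must be in {1,2,3,4}")
        return configured_start
    # Not higher-layer configured: follow the CFI value received in PCFICH.
    return cfi

# Each ePDCCH set is resolved independently:
print(epdcch_start_symbol(2, cfi=3))     # 2 (higher-layer configuration wins)
print(epdcch_start_symbol(None, cfi=3))  # 3 (falls back to CFI)
```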
The LTE Rel-10 standard supports bandwidths larger than 20 MHz. One important requirement on LTE Rel-10 is to assure backward compatibility with LTE Rel-8. This should also include spectrum compatibility. That would imply that an LTE Rel-10 carrier, wider than 20 MHz, should appear as a number of LTE carriers to an LTE Rel-8 terminal. Each such carrier can be referred to as a Component Carrier (CC). In particular for early LTE Rel-10 deployments, it can be expected that there will be a smaller number of LTE Rel-10-capable terminals compared to the many LTE legacy terminals. Therefore, it is necessary to assure an efficient use of a wide carrier also for legacy terminals, i.e. that it is possible to implement carriers where legacy terminals can be scheduled in all parts of the wideband LTE Rel-10 carrier. The straightforward way to obtain this would be by means of Carrier Aggregation (CA). CA implies that an LTE Rel-10 terminal can receive multiple CCs, where each CC has, or at least has the possibility to have, the same structure as a Rel-8 carrier. CA is illustrated in FIG. 4. A CA-capable UE is assigned a primary cell (PCell) which is always activated, and one or more secondary cells (SCells) which may be activated or deactivated dynamically.
The number of aggregated CC as well as the bandwidth of the individual CC may be different for uplink and downlink. A symmetric configuration refers to the case where the number of CCs in downlink and uplink is the same whereas an asymmetric configuration refers to the case that the number of CCs is different. It is important to note that the number of CCs configured in a cell may be different from the number of CCs seen by a terminal: A terminal may for example support more downlink CCs than uplink CCs, even though the cell is configured with the same number of uplink and downlink CCs.
In addition, a key feature of carrier aggregation is the ability to perform cross-carrier scheduling. This mechanism allows a (e)PDCCH on one CC to schedule data transmissions on another CC by means of a 3-bit Carrier Indicator Field (CIF) inserted at the beginning of the (e)PDCCH messages. For data transmissions on a given CC, a UE expects to receive scheduling messages on the (e)PDCCH on just one CC—either the same CC, or a different CC via cross-carrier scheduling; this mapping from (e)PDCCH to PDSCH is also configured semi-statically.
The UE performs periodic cell search and RSRP and RSRQ measurements in RRC Connected mode. It is responsible for detecting new neighbor cells, and for tracking and monitoring already detected cells. The detected cells and the associated measurement values are reported to the network. Reports to the network can be configured to be periodic, or aperiodic based on a particular event.
To share the channel in the unlicensed spectrum, the LAA SCell cannot occupy the channel indefinitely. One of the mechanisms for interference avoidance and coordination among small cells is the SCell ON/OFF feature. In Rel-12 LTE, discovery signals were introduced to provide enhanced support for SCell ON/OFF operations. Specifically, these signals are introduced to handle potentially severe interference situations (particularly on the synchronization signals) resulting from dense deployment, as well as to reduce UE inter-frequency measurement complexity.
The discovery signals in a DRS (Discovery Reference Signal) occasion are comprised of the primary synchronization signal (PSS), secondary synchronization signal (SSS), common reference signal (CRS), and when configured, the channel state information reference signals (CSI-RS). The PSS and SSS are used for coarse synchronization, when needed, and for cell identification. The CRS is used for fine time and frequency estimation and tracking and may also be used for cell validation, i.e., to confirm the cell ID detected from the PSS and SSS. The CSI-RS is another signal that can be used in dense deployments for cell or transmission point identification. FIG. 5 shows the presence of these signals in a DRS occasion of length equal to two subframes and also shows the transmission of the signals over two different cells or transmission points.
The DRS occasion corresponding to transmissions from a particular cell may range in duration from one to five subframes for FDD and two to five subframes for TDD. The subframe in which the SSS occurs marks the starting subframe of the DRS occasion. This subframe is either subframe 0 or subframe 5 in both FDD and TDD. In TDD, the PSS appears in subframe 1 and subframe 6 while in FDD the PSS appears in the same subframe as the SSS. The CRS are transmitted in all downlink subframes and downlink pilot time slot (DwPTS) regions of special subframes.
The discovery signals should be useable by the UE for performing cell identification, reference signal received power (RSRP) and reference signal received quality (RSRQ) measurements. The RSRP measurement definition based on discovery signals is the same as in prior releases of LTE. The RSSI measurement is defined as an average over all OFDM symbols in the downlink parts of the measured subframes within a DRS occasion. The RSRQ is then defined as

DRSRQ = N × DRSRP / DRSSI,

where N is the number of PRBs used in performing the measurement, DRSRP is the RSRP measurement based on the discovery signals, and DRSSI is the RSSI measured over the DRS occasion.
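This RSRQ computation can be sketched directly; the function name and the example numbers are illustrative, and the quantities are assumed to be in linear units:

```python
def drs_based_rsrq(n_prb, drsrp, drssi):
    """DRSRQ = N x DRSRP / DRSSI, following the definition above.

    n_prb: number of PRBs used in performing the measurement.
    drsrp: RSRP based on the discovery signals (linear scale, e.g. mW).
    drssi: RSSI averaged over the downlink OFDM symbols of the DRS
           occasion (linear scale, same unit as drsrp).
    """
    if drssi <= 0:
        raise ValueError("DRSSI must be positive")
    return n_prb * drsrp / drssi

# Illustrative numbers, not taken from any specification:
print(drs_based_rsrq(n_prb=50, drsrp=0.002, drssi=0.5))  # 0.2
```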
In Rel-12, RSRP measurements based on the CRS and CSI-RS in the DRS occasions and RSRQ measurements based on the CRS in the DRS occasions have been defined. As stated earlier, discovery signals can be used in a small cell deployment where the cells are being turned off and on or in a general deployment where the on/off feature is not being used. For instance, discovery signals could be used to make RSRP measurements on different CSI-RS configurations in the DRS occasion being used within a cell, which enables the detection of different transmission points in a shared cell.
When measurements are made on the CSI-RS in a DRS occasion, the UE restricts its measurements to a list of candidates sent to the UE by the network via RRC signaling. Each candidate in this list contains a physical cell ID (PCID), a virtual cell ID (VCID) and a subframe offset indicating the duration (in number of subframes) between the subframe where the UE receives the CSI-RS and the subframe carrying the SSS. This information allows the UE to limit its search. The UE correlates the received signal against the candidates indicated by the radio resource control (RRC) signaling and reports back any CSI-RS RSRP values found to meet some reporting criterion, e.g., exceeding a threshold value.
When a UE is being served on multiple carrier frequencies via a PCell and one or more SCells, the UE needs to perform radio resource management (RRM) measurements on other cells on the currently used carrier frequencies (intra-frequency measurements) as well as on cells on other carrier frequencies (inter-frequency measurements). Since the discovery signals are not transmitted continuously, the UE needs to be informed about the timing of the discovery signals so as to manage its search complexity. Furthermore, when a UE is being served on as many carrier frequencies as it is capable of supporting and inter-frequency RRM measurements need to be performed on a different carrier frequency that is not currently being used, the UE is assigned a measurement gap pattern. This gap pattern allows the UE to retune its receiver from a serving frequency to the other frequency on which measurements are to be performed. During this gap duration, the UE cannot be scheduled by the eNB on the current serving frequency. Knowledge of the timing of the discovery signals is especially important when the use of such measurement gaps is needed. Beyond mitigating UE complexity, this also ensures that the UE is not left unavailable for scheduling for prolonged periods of time on the current serving frequencies (PCell or SCell).
The provision of such timing information is done via a discovery measurement timing configuration (DMTC) that is signaled to the UE. The DMTC provides a window with a duration of 6 ms occurring with a certain periodicity and timing within which the UE may expect to receive discovery signals. The duration of 6 ms is the same as the measurement gap duration as defined currently in LTE and allows the measurement procedures at the UE for discovery signals to be harmonized regardless of the need for measurement gaps. Only one DMTC is provided per carrier frequency including the current serving frequencies. The UE can expect that the network will transmit discovery signals so that all cells that are intended to be discoverable on a carrier frequency transmit discovery signals within the DMTCs. Furthermore, when measurement gaps are needed, it is expected that the network will ensure sufficient overlap between the configured DMTCs and measurement gaps.
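A minimal sketch of the DMTC window semantics described above, assuming a periodicity and offset expressed in subframes (1 ms each); the helper names, the offset convention and the example periodicity are assumptions, and only the 6 ms window duration comes from the text:

```python
def in_dmtc_window(subframe_index, dmtc_offset, dmtc_periodicity, window_ms=6):
    """Return True if a subframe falls inside the DMTC window.

    The window opens at dmtc_offset (in subframes) within every
    dmtc_periodicity subframes and lasts window_ms subframes. The 6 ms
    default matches the LTE measurement gap duration mentioned above;
    the other conventions here are illustrative.
    """
    position = (subframe_index - dmtc_offset) % dmtc_periodicity
    return position < window_ms

# Example: 40 ms periodicity, window opening at subframe offset 5.
print(in_dmtc_window(7, dmtc_offset=5, dmtc_periodicity=40))   # True
print(in_dmtc_window(20, dmtc_offset=5, dmtc_periodicity=40))  # False
```

The UE would only search for discovery signals in subframes for which this predicate holds, which is what bounds its measurement complexity.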
Turning to Wireless Local Area Networks, in typical deployments of WLAN, carrier sense multiple access with collision avoidance (CSMA/CA) is used for medium access. This means that the channel is sensed to perform a clear channel assessment (CCA), and a transmission is initiated only if the channel is declared as Idle. In case the channel is declared as Busy, the transmission is essentially deferred until the channel is deemed to be Idle. When the range of several APs using the same frequency overlap, this means that all transmissions related to one AP might be deferred in case a transmission on the same frequency to or from another AP which is within range can be detected. Effectively, this means that if several APs are within range, they will have to share the channel in time, and the throughput for the individual APs may be severely degraded. A general illustration of the listen before talk (LBT) mechanism is shown in FIG. 6.
There are several versions of LBT thus far classified in Release 13. These are:
1. Category 1: No LBT. No LBT procedure is performed by the transmitting entity.
2. Category 2: LBT without random back-off. The duration of time that the channel is sensed to be idle before the transmitting entity transmits is deterministic.
3. Category 3: LBT with random back-off with a contention window of fixed size. The LBT procedure has the following as one of its components: the transmitting entity draws a random number N within a contention window. The size of the contention window is specified by the minimum and maximum values of N, and is fixed. The random number N is used in the LBT procedure to determine the duration of time that the channel is sensed to be idle before the transmitting entity transmits on the channel.
4. Category 4: LBT with random back-off with a contention window of variable size. The LBT procedure has the following as one of its components: the transmitting entity draws a random number N within a contention window. The size of the contention window is specified by the minimum and maximum values of N, and the transmitting entity can vary the size of the contention window when drawing the random number N. The random number N is used in the LBT procedure to determine the duration of time that the channel is sensed to be idle before the transmitting entity transmits on the channel.
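The random back-off component shared by Categories 3 and 4 can be sketched as follows. The function names, the [0, cw] draw convention, and the doubling rule for the variable window are illustrative assumptions, not normative procedure:

```python
import random

def lbt_backoff(channel_idle, cw):
    """One back-off round: draw N within the contention window [0, cw],
    then sense the channel until it has been sensed idle for N slots.
    channel_idle() models one sensing slot; illustrative sketch only.
    Returns (N drawn, total slots sensed)."""
    n = random.randint(0, cw)
    remaining = n
    sensed = 0
    while remaining > 0:
        sensed += 1
        if channel_idle():
            remaining -= 1  # the counter decrements only on idle slots
    return n, sensed

def next_contention_window(cw, success, cw_min=15, cw_max=1023):
    """Category 4 varies the window: as one plausible policy, double it
    (up to cw_max) after a failed attempt and reset to cw_min after
    success. Keeping cw fixed instead reduces this to Category 3.
    The numeric defaults are illustrative."""
    return cw_min if success else min(2 * cw + 1, cw_max)

random.seed(1)
n, sensed = lbt_backoff(lambda: True, cw=15)
print(n == sensed)  # True: on an always-idle channel every slot counts down
print(next_contention_window(15, success=False))  # 31
```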
Regarding Carrier Selection, as there is a large available bandwidth of unlicensed spectrum, carrier selection is required for LAA nodes to select the carriers with low interference and with that achieve good co-existence with other unlicensed spectrum deployments. For any technology, when deploying an additional node, the first rule for achieving high-performance for the new node itself as well as for the existing nodes is to scan the available channels and select one that would receive least interference for the node itself and cause least interference to existing nodes.
The basic principle behind carrier selection is for the eNB to scan and sense channels for interference or radar detection, and configure the SCell frequency accordingly based on the outcome of its carrier selection algorithm. The carrier selection process is separate and on a different time scale from the LBT/CCA procedure prior to transmissions on the unlicensed channels. It is expensive to move all attached UEs to another carrier frequency due to the signaling required and interruptions in the data flow.
Autonomous, semi-static carrier selection can be based on the eNB sensing of the averaged interference level, potential presence of radar signals if required, and traffic load on the candidate carriers over a relatively longer time scale. Once a suitable set of carriers is identified, they are added and activated as SCells for UEs. This process may be repeated periodically over tens or hundreds of milliseconds in order to keep reassessing the interference environment, and the associated measurements do not need any new specifications. Once a set of carriers is activated after the carrier selection process, transmissions can be performed dynamically on one or more of them based on LBT and fast DTX.
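The semi-static selection step above can be sketched as a simple filter-and-rank over sensed candidates. The function, its parameters, the threshold convention and the example frequencies and levels are all illustrative assumptions:

```python
def select_carriers(candidates, max_scells, interference_threshold, radar_detected):
    """Semi-static carrier selection sketch following the description above.

    candidates: dict mapping carrier frequency (MHz) to averaged
    interference level (dBm) sensed over a long time scale.
    radar_detected: predicate excluding carriers where radar was sensed.
    Returns the lowest-interference eligible carriers, to be added and
    activated as SCells. Illustrative helper, not a standardized algorithm.
    """
    eligible = [
        (interference, freq)
        for freq, interference in candidates.items()
        if not radar_detected(freq) and interference < interference_threshold
    ]
    eligible.sort()  # lowest interference first
    return [freq for _, freq in eligible[:max_scells]]

# Illustrative measurements over four 5 GHz candidate carriers:
measured = {5180: -92.0, 5200: -75.0, 5220: -95.0, 5240: -88.0}
print(select_carriers(measured, max_scells=2,
                      interference_threshold=-80.0,
                      radar_detected=lambda f: f == 5220))  # [5180, 5240]
```

Re-running such a selection every tens or hundreds of milliseconds corresponds to the periodic reassessment described above; per-transmission LBT/CCA then operates on the activated carriers at a much faster time scale.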
For Licensed assisted access (LAA) to unlicensed spectrum using LTE, up to now, the spectrum used by LTE has been dedicated to LTE. This has the advantage that the LTE system does not need to care about coexistence issues, and the spectrum efficiency can be maximized. However, the spectrum allocated to LTE is limited and cannot meet the ever-increasing demand for larger throughput from applications/services. Therefore, a new study item has been initiated in 3GPP on extending LTE to exploit unlicensed spectrum in addition to licensed spectrum. Unlicensed spectrum can, by definition, be simultaneously used by multiple different technologies. Therefore, LTE needs to consider the coexistence issue with other systems such as IEEE 802.11 (Wi-Fi). Operating LTE in the same manner in unlicensed spectrum as in licensed spectrum can seriously degrade the performance of Wi-Fi, as Wi-Fi will not transmit once it detects that the channel is occupied.
Furthermore, one way to utilize the unlicensed spectrum reliably is to transmit essential control signals and channels on a licensed carrier. That is, as shown in FIG. 7, a UE is connected to a Primary Cell, PCell, in the licensed band and one or more Secondary Cells, SCells, in the unlicensed band. In this application we denote a secondary cell in unlicensed spectrum as license assisted secondary cell (LA SCell).
Recently there have also been proposals to operate LTE in unlicensed spectrum without the aid of a licensed carrier. In such a standalone operation in unlicensed spectrum using LTE, the PCell will also operate on the unlicensed carrier and thus essential control signals and channels will also be subject to unmanaged interference and LBT.
LTE mobility, i.e. maintaining a connection while the UE is moving between different network nodes, is typically handled on the PCell. When the PCell is operating in unlicensed spectrum, the signals used for mobility (typically PSS/SSS and CRS) are typically transmitted rather sparsely, e.g. in the DRS occasion. In addition, they are all subject to LBT and thus their presence is not guaranteed.
Further the rather dense system information broadcast messages that are typically transmitted on the PCell will also need to be transmitted more sparsely and under LBT constraints.
Network synchronization refers to the degree of time and frequency synchronization the network nodes have. The degree of synchronization typically varies from:
- Tight synchronization, enough for advanced transmission techniques, which in today's LTE system is on the μs level
- Coarse synchronization, enough for aligning e.g. DRS occasions with DMTC windows and measurement gaps, typically on the ms level
- No synchronization
Using a fixed-size DMTC window of 6 ms is not suitable for systems that need to perform clear channel assessment before transmitting, nor for networks with a varying degree of eNB time (or frequency) synchronization.
For systems subject to LBT, the expected delay to access the channel depends on the interference level from other nodes, and thus using a fixed window is not suitable. If the window is set large enough to cater for the worst possible delay, UE power consumption will suffer because the UE is required to look for neighbor cells throughout the window. On the other hand, if the window is set too small, the UE might fail to detect some neighbor cells, potentially leading to poor mobility performance.
For systems with varying degrees of synchronization, the DMTC window needs to be set to guarantee that all neighbor cells' DRS transmissions fall within the window. For a network (NW) without any synchronization, this would imply that the DMTC window size would need to be equal to the DRS period. If the DMTC window size is fixed, a UE in a network with at least some degree of synchronization would not be able to benefit, because the window size would need to be set based on the worst case (no synchronization).
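The trade-off above can be made concrete with a toy model: the window must cover the DRS occasion length plus the network's timing uncertainty, and with no synchronization it degenerates to the full DRS period. This is an illustrative model only, not a formula from any specification:

```python
def required_dmtc_window_ms(drs_duration_ms, sync_uncertainty_ms, drs_period_ms):
    """Minimum window (toy model) that still captures neighbor-cell DRS:
    the DRS occasion length plus the inter-node timing uncertainty,
    capped at the DRS period (beyond that, the window might as well
    span the whole period). Illustrative assumption, not normative."""
    return min(drs_duration_ms + sync_uncertainty_ms, drs_period_ms)

# Tightly synchronized network: a small window suffices.
print(required_dmtc_window_ms(2, 1, drs_period_ms=40))     # 3
# No synchronization: the window degenerates to the full DRS period.
print(required_dmtc_window_ms(2, 1000, drs_period_ms=40))  # 40
```

This illustrates why a single fixed 6 ms window is a poor fit: the appropriate window size varies by more than an order of magnitude with the degree of synchronization.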
Therefore, there is a need for methods and arrangements for configuring the timing without suffering from the problems mentioned above.