To meet the surge in traffic demand and connectivity, radio technology for communication systems is gradually shifting toward a more flexible utilization of the available frequency spectrum at the network nodes forming the radio access network infrastructure, as well as toward denser deployments of low-powered network nodes with smaller coverage areas. In this context, the quality of experience of a user (e.g., in terms of average data rate) can be improved through more flexible and dynamic connections established with the network nodes having the potential and the resources to provide the desired service. Thus, user devices should be connected to network nodes that do not necessarily provide the best signal strength but rather have more resources available or, equivalently, less traffic load per frequency resource. To that end, network nodes can be enabled to operate in multiple (not necessarily contiguous) frequency spectrum bandwidths, hereafter referred to as frequency spectrum segments or frequency bands available at a network node.
A frequency spectrum segment is a portion of the frequency spectrum band available at a network node. Thus, the available frequency band of a network node is divided into a number of segments, where the size of the segments may differ from one network node to another. For instance, a frequency spectrum segment may comprise a portion of, or an entire, component carrier (as in the 3GPP Long Term Evolution, LTE, system) or a portion of, or the entire, frequency bandwidth associated with a radio access technology (RAT) available at a network node. The term available indicates that a frequency spectrum segment is a resource of a network node. Thus, a network node may autonomously determine, or may be configured to use/activate, one or more frequency spectrum segments on which it can operate.
In this context, the utilization of frequency spectrum and RAT available at a network node shall be adapted depending on the traffic/service demand, the type of traffic, the interference pattern, as well as the energy cost of operating with a larger portion of frequency spectrum or multiple RATs. In turn, the problem of controlling and making the utilization of spectrum flexible at the network side becomes a problem of associating/connecting user devices to frequency spectrum segment(s), and hence to the corresponding network node(s), that can provide the service desired by the user device, rather than assuring a connection to the network node that offers the best signal strength.
Thus, resource allocation methods for flexible spectrum utilization at the network nodes shall comprise more advanced cell-association and inter-frequency load balancing schemes that adapt the utilization of frequency spectrum at the network nodes so as to comply with users' traffic/service demands and network's energy costs.
In traditional cellular radio systems, user devices access the network by first searching for synchronization signals transmitted by network nodes and measuring the strength of the associated reference signals, and then by transmitting an access request to the network node that provides the strongest received signal. A user device already connected to the network, on the other hand, is typically required to monitor the signal strength of multiple network nodes so as to facilitate handover from a serving network node to another network node when the signal strength of the latter becomes better than that of the former. Either procedure aims at assuring that the user device is always associated or connected to the network node that provides the best signal strength. This, however, neither guarantees the best usage of the network resources nor assures the best service to the users.
For instance, assuming a network node n applies an equal share of the available time-frequency radio resources to the served user devices, the theoretically achievable average user data rate for a user m can be modelled through the Shannon bound as
rm,n = (Wn / Ln) · log2(1 + SINRm,n),
where Wn and Ln are the frequency spectrum bandwidth and the traffic load (e.g., expressed as the average number of active users served) of access node n, while SINRm,n is the signal to interference plus noise ratio experienced by user m from access node n. It is clear from this equation that a network node n′ with lower traffic load Ln′ < Ln can provide a higher average data throughput despite a lower signal strength (i.e., even when SINRm,n′ < SINRm,n).
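The effect of traffic load on the achievable rate can be illustrated with a short numerical sketch of the above formula (the bandwidths, loads and SINR values below are hypothetical, chosen only to make the comparison concrete):

```python
import math

def avg_user_rate(bandwidth_hz, load_users, sinr_db):
    """Shannon-bound average user rate r = (W / L) * log2(1 + SINR),
    with the SINR given in dB and converted to linear scale."""
    sinr_linear = 10 ** (sinr_db / 10)
    return (bandwidth_hz / load_users) * math.log2(1 + sinr_linear)

# Node n: strong signal (20 dB SINR) but heavily loaded (10 active users).
rate_n = avg_user_rate(bandwidth_hz=20e6, load_users=10, sinr_db=20.0)

# Node n': weaker signal (10 dB SINR) but lightly loaded (2 active users).
rate_n_prime = avg_user_rate(bandwidth_hz=20e6, load_users=2, sinr_db=10.0)

# Despite the lower SINR, the lightly loaded node offers the higher rate.
assert rate_n_prime > rate_n
```

Here node n′ yields roughly 34.6 Mbit/s against roughly 13.3 Mbit/s for node n, illustrating that load, not only signal strength, determines the rate a node can offer.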
The 3GPP LTE-A Rel.-12 system has partially addressed this issue by investigating mechanisms for balancing the traffic load among network nodes. The purpose of load shifting/balancing is to improve the system performance by changing the traffic load distribution over network nodes either to obtain a more evenly distributed traffic load across the network nodes or to concentrate the traffic into fewer network nodes so as to mitigate inter-cell interference. To that end, it was proposed to achieve load balancing/shifting in the 3GPP system via cell association through one of the following methods:
Cell association based on the strongest reference signal received power (RSRP) in conjunction with a cell association bias;
Cell association based on the strongest reference signal received quality (RSRQ) in conjunction with a cell association bias or threshold;
Cell association based on long-term SINR UE measurements in conjunction with a cell association bias;
Cell association based on a function of UE measurements (RSRP, RSRQ, long-term SINR) and of network-side information (e.g., cell resource utilization);
Cell association based on RSRQ or SINR UE measurements within a shortened measurement interval.
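As a minimal sketch, the first of the above methods, biased-RSRP association, amounts to picking the node maximizing RSRP plus a per-node bias. The node identifiers, measurement values and bias values below are hypothetical; the 3GPP proposals define only the selection rule, not this code:

```python
def associate(rsrp_dbm, bias_db):
    """Biased cell association: pick argmax over nodes of (RSRP_n + bias_n).
    rsrp_dbm maps node id -> measured RSRP in dBm; bias_db maps
    node id -> association bias in dB (defaulting to 0 dB)."""
    return max(rsrp_dbm, key=lambda n: rsrp_dbm[n] + bias_db.get(n, 0.0))

# Without a bias the UE picks the macro node with the strongest RSRP;
# a 9 dB bias on the small cell steers the UE toward it instead.
measurements = {"macro": -80.0, "small": -86.0}
assert associate(measurements, {}) == "macro"
assert associate(measurements, {"small": 9.0}) == "small"
```

The bias thus shifts traffic toward lightly loaded (typically low-power) nodes without requiring them to offer the strongest signal.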
Another conventional solution is enhanced frequency-domain interference coordination in LTE when two component carriers (i.e., frequency spectrum bands) are available at a network node. In order to reduce the inter-cell interference, the utilization/activation of component carriers should be coordinated among network nodes. To this end, the available component carriers are categorized into secondary cells (Scells) and a primary cell (Pcell). Then, in a first step, inter-cell interference is reduced by selecting a Pcell for different geographical areas, similarly to frequency reuse schemes in cellular systems. In a second step, an Scell is activated at a network node for a specific user device when high data throughput is requested. One criterion to add an additional Scell is, for instance, that the RSRQ of the Scell is higher than a certain threshold.
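The RSRQ-threshold criterion for Scell activation can be sketched as follows; the threshold value, the helper name and the measurement values are illustrative assumptions rather than values from any specification:

```python
def scells_to_activate(scell_rsrq_db, threshold_db=-12.0):
    """Return the Scells whose measured RSRQ exceeds the activation
    threshold; only these would be added for the user device."""
    return [cell for cell, rsrq in scell_rsrq_db.items() if rsrq > threshold_db]

# Only the Scell with RSRQ above the (hypothetical) -12 dB threshold is added.
assert scells_to_activate({"scell1": -10.0, "scell2": -15.0}) == ["scell1"]
```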
A drawback of the first conventional solution is that it aims at equalizing the load among network nodes without taking into account how the load could be distributed in relation to the frequency spectrum or radio access technologies available at each network node. An additional drawback of the first conventional solution is that the cell association criterion used for balancing the user distribution among network nodes does not take into account the traffic demand of the user. In other words, balancing the user distribution does not per se assure a fair distribution of the data traffic; e.g., users with very high traffic demand (and hence needing more time-frequency resources) may end up being associated with the same network node, whilst it would be more beneficial to distribute them among multiple network nodes. Thirdly, the cell association criteria used to distribute the users among network nodes only account for signal quality measured at the user devices, which is not per se an indication of the service (e.g., data throughput) that a network node can offer to the user device.
A drawback of the second conventional solution is that it is designed for two component carriers and assumes a static allocation of the primary component carrier (i.e., the Pcell) at the network nodes, thus requiring careful cell planning at the deployment stage. In practice, more than two component carriers may be made available at the network nodes, and their utilization should not be constrained.