Modern wireless communications networks include many different network topologies comprising heterogeneous mixtures of macrocell, microcell, picocell, and femtocell resources. At the highest level of wireless coverage, a macrocell provides cellular service for a relatively large physical area, often in areas where network traffic densities are low. In more dense traffic areas, a macrocell may act as an overarching service provider, primarily responsible for providing continuity for service area gaps between smaller network cells. In areas of increased traffic density, microcells are often utilized to add network capacity and to improve signal quality for smaller physical areas where increased bandwidth is required. Numerous picocells and femtocells generally add to network capacity for even smaller physical areas in highly populated metropolitan and residential regions of a larger data communications network.
As would be understood by those skilled in the art, in all wireless service provider networks, macrocells typically provide the largest wireless coverage area for licensed frequency spectra, followed by microcells, then picocells, and lastly femtocells. By way of example, in a typical wireless data communications network, a macrocell base station may provide a wireless coverage area ranging from one to five kilometers, measured radially from the center of the cell; a microcell base station may provide a coverage area ranging from one-half to one kilometer radially; a picocell base station may provide a coverage area ranging from 100 to 500 meters radially; and a femtocell base station may provide a coverage area of less than 100 meters radially. Each of these network cell or base station types is generally configured to connect with a particular service provider network using various common wireline communications technologies, including, but not limited to: fiber optic, twisted pair, powerline, and/or coaxial cable (joining cells to a backhaul network).
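The cell-size hierarchy described above can be sketched as a simple classification by coverage radius. This is an illustrative sketch only, not part of the described system; the function name and the threshold values are taken from the example ranges given above and are not normative.

```python
def classify_cell(radius_m: float) -> str:
    """Classify a base station by approximate coverage radius (meters),
    using the example ranges given above (illustrative thresholds)."""
    if radius_m < 100:
        return "femtocell"      # less than 100 m radially
    elif radius_m <= 500:
        return "picocell"       # roughly 100 to 500 m radially
    elif radius_m <= 1000:
        return "microcell"      # roughly one-half to one kilometer
    else:
        return "macrocell"      # roughly one to five kilometers

print(classify_cell(50))    # femtocell
print(classify_cell(3000))  # macrocell
```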
Macrocell and microcell network deployments are typically designed and orchestrated by radio communications engineers and scientists who model idealized radio propagation (including various path loss considerations) and frequency planning scenarios during a network planning phase, in order to provide optimal cell provisioning for various network resources. Computer modeling is often employed to determine frequency spectrum allocation for network cells, including frequency reuse assignment, and radio operation level (e.g., power levels and/or modulation and coding schemes) assignments for various network base stations.
These modeling operations often attempt to simulate radio frequency isolation contributors by using a variety of theoretical path loss models. These idealized models require relatively conservative estimates of inter-access node isolation in order to minimize the possibility of co-channel interference (CCI) between neighboring base stations in a wireless network. These digital tools fail to accurately predict challenging radio frequency propagation environments (e.g., most real world metropolitan environments), including interference between overarching macrocells and structurally contained picocells and/or femtocells (e.g., cells contained within office buildings or residential housing). These modern planning utilities are also deficient, by being overly conservative, in estimating time-varying radio frequency isolation contributors, such as changing seasonal foliage, time-varying regional vehicular traffic patterns, radio access node power control in response to access node utilization, etc.
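The kind of idealized path loss model referred to above can be sketched as a log-distance model: free-space loss at a reference distance, then a distance-power law. This is an illustrative sketch only; the exponent and reference values are assumptions chosen for illustration (a higher exponent reflects the kind of conservative urban assumption the text criticizes), not values from the described system.

```python
import math

def log_distance_path_loss_db(distance_m: float, freq_mhz: float,
                              exponent: float = 3.5,
                              d0_m: float = 1.0) -> float:
    """Estimated path loss (dB) at distance_m using a log-distance model.

    Free-space path loss at reference distance d0 (standard formula:
    20*log10(d_km) + 20*log10(f_MHz) + 32.44), plus a distance-power
    law governed by the chosen path loss exponent.
    """
    fspl_d0 = (20 * math.log10(d0_m / 1000.0)
               + 20 * math.log10(freq_mhz) + 32.44)
    return fspl_d0 + 10 * exponent * math.log10(distance_m / d0_m)
```

Because such a model is a single static formula, it cannot capture time-varying isolation contributors (foliage, vehicular traffic, neighbor power control), which is why planners compensate with conservative exponents and margins.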
In order to reduce the possibility of CCI amongst network cells re-using the same frequency spectra, conservative network planning and resource optimization processes often result in unnecessarily reduced radio operation levels at network base stations. These overly limited resource operations can unduly waste network capacity by improperly constraining network resource utilization. As would be understood by those skilled in the art, co-channel interference or CCI generally refers to interference caused by multiple network base stations operating on the same frequency within a region of a wireless communications network. In many cellular communications networks (e.g., in LTE, GSM, and UMTS networks), frequency spectrum is a scarce resource that is divided into non-overlapping spectrum bands that may be assigned to different network cells in accordance with specific frequency planning methodologies. Generally, frequency planning limitations require frequency channels to be re-used, such that the same frequency spectrum bands or channels are re-assigned amongst neighboring network cells in a specific pattern. In scenarios with significant CCI, users located at the periphery of interfering cells often experience diminished service capacity, dropped communications, and frequent handoffs.
In many existing cellular networks, service providers utilize mobile network resource testing vehicles to periodically gather information to help them manually compensate for the effects of real-world radio frequency isolation contributors and neighboring interference sources. Unfortunately, these mobile testing solutions require manual operation as well as manual radio operating parameter adjustment at network resource sites. These solutions are also expensive to routinely employ, and they are too infrequently utilized to keep up with dynamically changing radio isolation and interference sources. Accordingly, existing theoretical modeling and manual testing/compensation techniques are inadequate solutions for effectively determining and neutralizing many of the negative effects associated with dynamically changing network environments, which are becoming more and more complex with the rapid deployment of increasing numbers of smaller network cells in evolving wireless communications networks (e.g., with the evolution of 4G communications networks).
These new deployment topologies may result in robust mixtures of network cell coverages within regions of overlapping wireless service. In particular, many modern, low power base stations (e.g., picocell and femtocell base stations) are readily transportable within an existing communications network by end users. This mobility can create a situation where many smaller cell base stations may be moved to unpredictable locations within a network where their operation could potentially produce substantial interference to surrounding network infrastructure, unless their maximum radio power levels are constrained to reduce unwanted instances of network interference. These ad-hoc cell deployments are difficult to model, because end users often do not register their devices' new locations with their local service providers. As a result, modern mobile network resource optimization solutions are not utilized frequently enough to timely learn of their presence and then compensate for their interfering effect within a particular network cell.
Further, limited samplings of difficult-to-estimate neighbor cell isolation information made by mobile solutions may be improperly utilized to determine dynamic regional radio resource allocations (e.g., such as time-varying allocation of common radio bearer channels between neighboring cells). When local or temporal neighbor cell isolation is accurately determined to be sufficient, radio channels may be reused between cells even when they are geographically close to one another (e.g., in a scenario where smaller cells are structurally contained within a high path loss environment such as a brick building). Similarly, accurate radio isolation determinations can also be used in advanced local optimization algorithms such as automated common channel power control routines that adjust local base station transmit power with the goal of optimizing local coverage, while minimizing interference to neighboring cells. Unfortunately, modern computer path loss modeling techniques are inaccurate and unreliable, and most mobile testing solutions provide inadequate samplings of dynamically changing isolation environments to be efficiently utilized in modern network resource planning and optimization.
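The common channel power control routine described above can be sketched as a cap on transmit power derived from a measured isolation value. This is an illustrative sketch only; the function name, the interference target, and the power cap are all assumptions for illustration, not parameters of the described system.

```python
def max_tx_power_dbm(isolation_db: float,
                     interference_target_dbm: float = -110.0,
                     power_cap_dbm: float = 30.0) -> float:
    """Highest transmit power (dBm) that keeps interference received by
    the most-affected co-channel neighbor at or below the target, given
    a measured isolation (path loss between the two stations, in dB),
    subject to a hardware power cap."""
    # Received interference = tx power - isolation, so the allowed tx
    # power is the target plus the measured isolation.
    allowed = interference_target_dbm + isolation_db
    return min(allowed, power_cap_dbm)

# Higher measured isolation (e.g., a femtocell structurally contained
# within a brick building) permits higher transmit power, and hence
# more aggressive reuse of common radio bearer channels nearby.
```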
Accordingly, it would be helpful to be able to adequately compensate for radio channel isolation contributors that are both time-varying as well as static in nature, because these isolation contributors are often very difficult to accurately estimate/determine using modern computer modeling and mobile testing techniques. It would further be advantageous to have improved systems and methods that could account for radio frequency isolation sources in the line of sight path between neighboring network cells that change over the course of a single day (e.g., hourly traffic patterns). It would be helpful to be able to utilize existing network resources (e.g., distributed user equipment and neighboring base stations) to account for actual network resource operating conditions, in order to facilitate accurate determinations of network radio frequency isolation characteristics between and amongst various network base stations. It would further be advantageous if these improved solutions enhanced radio access network performance by employing optimized channel assignment algorithms to effectively manage radio resources based on ongoing automated measurements of changing isolation and interference sources in a dynamic network environment. These improved, self-optimizing network utilities would effectively automate processes that were previously largely manual tasks, thereby reducing the level of required human intervention for successful network operations. This would result in operational and/or deployment savings, and it would provide for many other performance, quality, and operational benefits. The importance of these benefits would be readily understood by those familiar with the multitude of benefits commonly associated with self-organized network solutions.