Communications systems and equipment are known and continue to evolve. Many such systems now provide uplink and downlink data channels as well as uplink and downlink control channels, and these channels are further distinguished in various ways. Generally, the control channels are used to establish control parameters for the systems and equipment and to allocate communication resources among user equipment (UE). The control schemes also consider battery life for UEs and are therefore designed to limit the time and extent to which each UE is operational.
In an effort to provide as many services to as many users as possible, on an as-needed basis, given a finite resource (spectrum allocation), proposed systems carefully control UE access to almost all of the spectral resources. This approach helps ensure that resources are allocated only when a need exists and when such resources serve their intended purpose. For instance, in a proposed Long Term Evolution (LTE) system now being developed, a UE has access only to limited channels, e.g., the synchronization channel (SCH), broadcast channel(s) (BCH), reference signal (RS), a Random Access Channel (ASYNCH RACH, SYNCH RACH), and a paging channel (PCH), until some form of allocation or grant is provided to the UE by the system infrastructure (eNodeB or scheduler). This generally includes grants or allocations for each use of an uplink control channel or an uplink or downlink data channel.
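The grant-gated access described above can be modeled in a few lines. The sketch below is purely illustrative: the class and method names are hypothetical, and the channel sets merely mirror the channels listed in the text (always-available channels versus channels that require a grant from the scheduler), not any actual LTE signaling interface.

```python
# Hypothetical model of grant-gated channel access. A UE may use only the
# always-available channels (SCH, BCH, RS, RACH, PCH) until the system
# infrastructure (eNodeB/scheduler) issues a grant for other channels.
# All names are illustrative, not an actual LTE API.

ALWAYS_AVAILABLE = {"SCH", "BCH", "RS", "RACH", "PCH"}


class UE:
    def __init__(self):
        # Channels granted by the scheduler; empty until a grant arrives.
        self.grants = set()

    def receive_grant(self, channel):
        # Record an allocation/grant provided by the infrastructure.
        self.grants.add(channel)

    def may_access(self, channel):
        # Access is permitted only for always-available channels
        # or channels covered by an explicit grant.
        return channel in ALWAYS_AVAILABLE or channel in self.grants


ue = UE()
assert ue.may_access("RACH")        # always available
assert not ue.may_access("PUSCH")   # uplink data channel: no grant yet
ue.receive_grant("PUSCH")
assert ue.may_access("PUSCH")       # access allowed once granted
```

The point of the model is simply that every non-broadcast resource use is preceded by an explicit grant, which is what produces the per-allocation overhead discussed next.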
One concern with this allocation approach is that each allocation requires some system overhead (messages between the UE and the scheduler, ACK/NACKs, etc.). As system overhead grows, it chips away at system capacity and is thus at odds with the objective of maximizing service availability.
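The capacity cost of this per-allocation signaling can be illustrated with simple arithmetic. The figures below (messages per allocation, bits per message, link capacity) are hypothetical, chosen only to show how overhead scales with the allocation rate.

```python
# Illustrative arithmetic for the overhead concern: if each data
# allocation costs a fixed number of signaling messages, the fraction
# of link capacity consumed by overhead grows with the allocation rate.
# All numeric figures here are hypothetical.

def overhead_fraction(allocations_per_sec, msgs_per_allocation,
                      bits_per_msg, link_capacity_bps):
    # Total signaling load in bits per second.
    overhead_bps = allocations_per_sec * msgs_per_allocation * bits_per_msg
    # Share of the link spent on signaling rather than user data.
    return overhead_bps / link_capacity_bps


# Example: 1000 allocations/s, 4 messages each (grant, ACK/NACK, etc.),
# 100 bits per message, on a 10 Mbit/s link:
frac = overhead_fraction(1000, 4, 100, 10_000_000)
print(frac)  # 0.04, i.e., 4% of capacity spent on signaling
```

Doubling the allocation rate doubles this fraction, which is why schemes that require a fresh grant for every transmission trade capacity for allocation precision.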