The present invention generally relates to the reduction or removal of packet jitter and latency in Time Division Multiple Access (TDMA) telecommunication systems including Multi-Frequency TDMA (MF-TDMA) systems.
Time Division Multiple Access (TDMA) is a channel access method for shared medium telecommunication networks. It allows several user terminals to share the same frequency channel by dividing the signal into different time slots. The user terminals transmit in rapid succession, one after the other, each using its own time slot. This allows multiple stations to share the same transmission medium (e.g., a radio frequency channel) while using only a part of its channel capacity. TDMA is used in 2G digital cellular systems such as Global System for Mobile Communications (GSM), IS-136, Personal Digital Cellular (PDC) and iDEN, and in the Digital Enhanced Cordless Telecommunications (DECT) standard for portable phones. It is also used extensively in satellite systems, in combat-net radio systems, and in passive optical networks (PONs) for upstream traffic from premises to the operator. For satellite networks, MF-TDMA is the dominant technology because it provides the most bandwidth and the greatest overall efficiency and service quality, while also allowing the dynamic sharing of that bandwidth among many (tens of thousands of) transmitters in a two-way communication mode.
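The slot-sharing principle described above can be illustrated with a minimal sketch. The frame duration, slot count, and function names below are purely illustrative assumptions, not parameters of any particular TDMA system:

```python
# Illustrative sketch: one TDMA frame divides a single frequency channel
# into fixed time slots, and each terminal transmits only in its own slot.
FRAME_DURATION_MS = 40.0   # assumed frame length (hypothetical)
NUM_SLOTS = 8              # assumed slots per frame (hypothetical)

def slot_schedule(terminals):
    """Assign each terminal a repeating slot.

    Returns a list of (terminal, slot_index, slot_start_ms) tuples,
    where slot_start_ms is the transmit offset within each frame.
    """
    slot_len = FRAME_DURATION_MS / NUM_SLOTS
    return [(term, slot, slot * slot_len)
            for slot, term in enumerate(terminals)]

for term, slot, start in slot_schedule(["A", "B", "C"]):
    print(f"terminal {term}: slot {slot}, transmits at {start:.1f} ms into each frame")
```

Because each terminal transmits only during its assigned slot, it uses only a fraction of the channel capacity, which is the sharing behavior described above.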
Unlike a Single Carrier Per Channel (SCPC) system, a system employing TDMA is more prone to producing packet jitter. In an SCPC system, user packets can be transmitted as soon as they arrive for transmission, whereas in a TDMA system there can be a variable temporal gap between when packets arrive and when time slots are allocated for the terminal to send those packets. These variable gaps manifest themselves as packet jitter. In addition to packet jitter, the queuing of packets for delayed transmission in allocated slots, and the queuing of packets in buffers used in conventional methods of reducing jitter, can add significantly to packet latency across the TDMA link. It is important to maintain good service quality in a telecommunications network, and the control and reduction of jitter and latency are a significant part of that. Jitter and latency measurements are often used in Mean Opinion Score (MOS) algorithms as a means to provide metrics for voice quality in voice applications.
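One common way such jitter is quantified, for example as an input to voice-quality metrics, is the smoothed interarrival jitter estimator defined in RFC 3550 (the RTP specification). The sketch below shows that estimator; the timestamp lists are illustrative:

```python
def interarrival_jitter(send_times, recv_times):
    """RFC 3550-style smoothed interarrival jitter estimate.

    For consecutive packets i-1 and i, the transit-time difference is
        D = (recv[i] - recv[i-1]) - (send[i] - send[i-1]),
    and the running estimate is updated as J += (|D| - J) / 16.
    Result is in the same units as the input timestamps.
    """
    j = 0.0
    for i in range(1, len(send_times)):
        d = (recv_times[i] - recv_times[i - 1]) - (send_times[i] - send_times[i - 1])
        j += (abs(d) - j) / 16.0
    return j

# Packets sent every 20 ms but received with irregular gaps (hypothetical values):
print(interarrival_jitter([0, 20, 40, 60], [100, 133, 142, 165]))
```

A constant transmission delay shifts every packet equally and produces zero jitter; only the variable gaps, such as those introduced by variable slot allocation in a TDMA system, raise the estimate.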
Conventional jitter buffers remove packet jitter by outputting packets based on the original packet timing, for example, as delivered from an upper layer (e.g., the application layer). A conventional jitter buffer, at the receiver end of a communications channel, receives packets at some packet interval. That packet interval may not be regular and may not match the original interval observed at the transmitter end. The difference between the packet interval at which packets are delivered to the transmitter at the transmission end of the communications channel and the packet interval at which those packets are received after transmission over the channel may be caused by several factors, such as allocated slot assignments within a TDMA frame and transmission delays over the communications channel. The jitter buffer at the receiver generally releases the received packets at a regular packet interval. The jitter buffer must therefore retain each packet for a period of time to achieve that regular interval, and must queue enough packets to ensure a packet is always available for output. In a conventional TDMA system, both the wait time for transmit opportunities and the buffer retention time necessary for jitter removal by a conventional jitter buffer introduce a considerable amount of packet latency into the link. Further, the packet latency is exacerbated in communications systems that exhibit relatively long transmission times (e.g., satellite communications systems, where the round-trip satellite delay is a major contributor to latency). As a result, such added packet latency adversely affects the service quality experienced by quality of service (QoS) sensitive applications (e.g., voice applications, such as Voice over Internet Protocol (VoIP)).
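The retain-and-release behavior of a conventional jitter buffer can be sketched as follows. The class, its parameter names, and the chosen delay and interval values are all hypothetical illustrations of the general technique, not a description of any particular implementation:

```python
import heapq

class JitterBuffer:
    """Minimal sketch of a conventional jitter buffer.

    Packets arriving with irregular gaps are held and released at a fixed
    playout interval. The initial holding delay is the latency cost paid
    to absorb jitter: larger delays tolerate more jitter but add latency.
    """

    def __init__(self, playout_interval_ms, initial_delay_ms):
        self.interval = playout_interval_ms   # regular output spacing
        self.delay = initial_delay_ms         # buffering delay (added latency)
        self.heap = []                        # (sequence_number, payload)
        self.first_arrival = None

    def push(self, seq, payload, arrival_ms):
        """Accept a packet from the channel at its (jittery) arrival time."""
        if self.first_arrival is None:
            self.first_arrival = arrival_ms
        heapq.heappush(self.heap, (seq, payload))

    def playout_time(self, seq):
        # Each packet is released a fixed delay after the first arrival,
        # spaced exactly one interval apart, restoring regular timing.
        return self.first_arrival + self.delay + seq * self.interval

    def pop(self, now_ms):
        """Release the next packet if its scheduled playout time has come."""
        if self.heap and now_ms >= self.playout_time(self.heap[0][0]):
            return heapq.heappop(self.heap)[1]
        return None
```

For example, with a 20 ms playout interval and a 60 ms initial delay, a packet arriving 13 ms late is still released on its regular schedule, but every packet carries the 60 ms buffering delay, which is the latency penalty described above.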
What is needed, therefore, is an approach for mitigating packet jitter and latency, thereby ensuring a high quality of service, for jitter- and latency-sensitive applications over TDMA (including MF-TDMA) systems.