The use of satellites for the transmission of voice and data communication signals has greatly expanded over the past decade, and satellite links are now routinely used and relied upon for global, almost instant, bi-directional communications. Rapid improvements in satellite communications have produced beneficial impacts on many segments of society, from business and commercial applications to consumer products. For example, residences throughout the world now receive news and entertainment broadcasts via satellite, in nearly real-time, from sites in almost any country.
Burst-mode communication techniques and the related synchronization techniques are frequently used in satellite voice and data transmissions. These systems typically employ multiple transmitters that send “bursts” of transmissions to a receiver. Bursts from the different transmitters are coordinated in time and frequency so that the transmitters can communicate with the receiver without interfering with one another. In one type of burst signal communication, Time Division Multiple Access (TDMA) allows multiple users to share a single carrier wave using time-division multiplexing (TDM) to transmit multiple signals on that carrier wave. TDM divides the carrier wave into time slots and then allocates those time slots to the different data signals. Effectively, each of the data signals takes turns accessing the carrier wave, thereby allowing a single carrier wave to carry multiple simultaneous data transmissions. A Multi-Frequency Time Division Multiple Access (MF-TDMA) receiver simultaneously receives TDMA signals on several different carrier frequencies. In the MF-TDMA data transmission scheme, any user can potentially transmit data at any frequency at any time. The actual time slot and frequency allocated to each user is based on capacity requests submitted by that user's terminal.
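The slot-allocation principle of TDM described above can be sketched as follows. The round-robin policy and terminal names here are purely illustrative; actual MF-TDMA allocations are driven dynamically by terminal capacity requests.

```python
# Illustrative sketch only: a fixed round-robin assignment of TDM time slots
# on a single carrier. Real MF-TDMA systems allocate slots (and carrier
# frequencies) dynamically, based on capacity requests from each terminal.

def allocate_slots(users, num_slots):
    """Assign each of num_slots time slots on one carrier to a user in turn."""
    return [users[slot % len(users)] for slot in range(num_slots)]

# Each terminal takes turns accessing the shared carrier wave.
frame = allocate_slots(["terminal_A", "terminal_B", "terminal_C"], 6)
```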
Within the TDMA bursts specified in the DVB-RCS standard, the carrier signal is modulated by data symbols in which a phase characteristic of the symbol represents the data. This type of modulation technique is known as “phase shift keying” data modulation. In general, each symbol can be represented as a phasor in which the phase state of the symbol at the correct detection instant, or the relative change in phase from symbol to symbol, represents data. This data, in turn, can be expressed as a corresponding bit or combination of bits in which the number of bits corresponds to the number of possible phase states used for data modulation. For example, in binary phase shift keying (BPSK), each symbol may have one of two phase states (i.e., 0, π). Each BPSK symbol can therefore represent a single binary digit (bit). In quadrature phase shift keying (QPSK), each symbol may have one of four phase states (i.e., 0, π/2, π, and 3π/2). Each QPSK symbol can therefore represent two binary digits. In the general “M-ary” phase shift keying (MPSK), each symbol may have one of M phase states. Each MPSK symbol can therefore represent n binary digits, where M = 2ⁿ.
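The bit-to-phase mapping just described can be sketched as follows. A natural binary mapping is assumed here for simplicity; practical systems commonly use Gray mapping instead.

```python
import math

# Illustrative M-PSK bit-to-phase mapping (natural binary mapping assumed;
# real systems typically use Gray mapping to reduce bit errors).

def mpsk_phase(bits):
    """Map a tuple of n bits to one of M = 2**n phase states, in radians."""
    n = len(bits)
    M = 2 ** n
    index = int("".join(str(b) for b in bits), 2)
    return 2 * math.pi * index / M

# BPSK (M = 2): one bit per symbol  -> phases 0 and pi.
# QPSK (M = 4): two bits per symbol -> phases 0, pi/2, pi, 3*pi/2.
```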
Typically, MF-TDMA signals are de-multiplexed and re-arranged to form a signal equivalent to a single carrier. The single-carrier data signal is then demodulated to recover the underlying data. The demultiplexing and demodulation steps are well known in the field of communications and are the subject of extensive research and development to improve transmission speeds, bandwidth, accuracy, and reliability.
Several satellite data transmission standards have been adopted to harmonize the transmission and reception of satellite communications broadcasts. One known standard adopted in the broadcast of Digital Video Broadcast (DVB) signals is the Digital Video Broadcast by Satellite (DVB-S) standard, EN 300 421 of the ETSI (European Telecommunications Standards Institute). This standard relates to DVB services and transparent satellite communication systems that provide DVB-S services directly to the user through an integrated receiver/decoder device located in the user's home. The versatility of DVB-S multiplexing permits the use of a transmission capacity encompassing a variety of television service configurations, including sound and data services.
The components of the DVB-S services are transmitted on a TDM carrier wave. For more information on the DVB-S standard, please refer to ETSI publication EN 300 421 V1.1.2 (1997-98), entitled “Digital Video Broadcasting (DVB); Framing Structure, Channel Coding and Modulation for 11/12 GHz Satellite Services,” the subject matter of which is hereby incorporated herein by reference.
Satellite broadcasts are also increasingly used for instantaneous two-way audio, video, and data communications. Accordingly, recent attention has been given to the demand for making satellite communications interactive so that recipients of the broadcast can also communicate back to the origin of the transmission. For example, satellite communications can be used to provide Internet connections between different users. In an effort to establish unified bi-directional satellite communications, the digital video broadcast with return channel by satellite (DVB-RCS) standard has been adopted by the ETSI.
The DVB-RCS standard relates to interaction channels on a satellite distribution system. The purpose of this standard is to provide basic specifications for providing interaction channels for interactive networks based on geostationary (GEO) satellites that incorporate return channel satellite terminals (RCST). The DVB-RCS standard facilitates the use of RCSTs for domestic installations of both individual and collective types. The DVB-RCS standard likewise supports the connection of the terminals with home data networks, and can be applied to all frequency bands allocated to GEO satellite services. For more information on the DVB-RCS standard, refer to ETSI publication, EN 301 790 v.1.3.1, dated 2003-03, entitled “Digital Video Broadcasting (DVB); Interaction Channel for Satellite Distribution Systems,” the subject matter of which is hereby incorporated herein by reference.
Satellite communication systems operating under the DVB-RCS standard can exchange data using a variety of network and Internet technologies. For example, the DVB-RCS standard accommodates Asynchronous Transfer Mode (ATM) technology for transferring data in cells or packets of a fixed size. The data packet used with ATM is relatively small compared to packets used with older technologies. The small, constant packet size allows ATM equipment to transmit video, audio, and computer data over the same network, and helps assure fairness. ATM creates a fixed channel, or route, between two points whenever data transfer begins, unlike TCP/IP, which divides messages into packets that can each take a different route from source to destination. The DVB-RCS standard may also be used to transmit MPEG (Moving Picture Experts Group) video, a family of digital video compression standards and file formats that achieve high compression rates by storing only the changes from one frame to the next, instead of each entire frame.
Loss of bursts (or packets), as measured by the packet loss ratio, is a main performance criterion under DVB-RCS. Due to the long propagation delay in geostationary satellite communications, the packet loss ratio should be low in order to avoid performance degradation at higher network layers. The human senses are generally tolerant of slight variations, so for the transmission of video and sound broadcasts, as defined by DVB-RCS, the packet loss ratio is preferably on the order of 1×10⁻⁵, so that less than one packet is lost per hundred thousand burst signals. For more stringent types of data transmissions, a lower packet loss ratio may be needed, such as 1×10⁻⁷, where less than one packet is lost per ten million packet transmissions.
One way to decrease the packet loss is to increase the signal transmission strength, or effective isotropic radiated power (EIRP), of the transmitter, thereby increasing the signal-to-noise ratio at the receiver. Improvements in the signal-to-noise ratio are desirable because, as provided by Shannon's theorem, the ultimate theoretical limit to the data transfer rate on a communications channel grows with the signal-to-noise ratio of that channel. Consequently, increasing the power transmitted on the return channel can often be a solution to provide adequately reliable communications. However, increasing the transmission power of the terminal unacceptably increases the cost of the terminal. Accordingly, there is a current need for a demodulation technology that allows reliable communications over a low-power burst-mode signal experiencing a low signal-to-noise ratio. More specifically, there is a current need for a demodulation technology that allows for sufficiently low packet loss rates for transmission in a DVB-RCS system while maintaining or even reducing current terminal transmission power levels in order to minimize the cost of user terminals.
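The relationship invoked above can be illustrated with the Shannon–Hartley formula, C = B·log₂(1 + S/N): capacity grows with the signal-to-noise ratio, but only logarithmically, which is one reason raw power increases are an expensive path to reliability. The bandwidth and SNR figures below are assumed for illustration only, not taken from the DVB-RCS standard.

```python
import math

# Shannon-Hartley channel capacity: C = B * log2(1 + S/N).
# The 1 MHz bandwidth and SNR values are illustrative assumptions.

def shannon_capacity(bandwidth_hz, snr_linear):
    """Theoretical capacity limit (bits/s) of an AWGN channel."""
    return bandwidth_hz * math.log2(1 + snr_linear)

def db_to_linear(snr_db):
    """Convert an SNR in decibels to a linear power ratio."""
    return 10 ** (snr_db / 10)

# Doubling transmit power (+3 dB SNR) raises the capacity limit,
# but only logarithmically, not proportionally:
c_low = shannon_capacity(1e6, db_to_linear(0))   # 0 dB SNR
c_high = shannon_capacity(1e6, db_to_linear(3))  # 3 dB SNR
```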
Another important aspect of DVB-RCS system application is the provision of services at Ka-band frequencies (e.g., a 30 GHz uplink from terminals to the satellite). The cost of the user terminal plays a major role in the business model of this type of service. The RF components of the terminals are costly, and less expensive RF components result in a tighter link budget on the uplink from the terminal to the satellite. For this application, the use of very power-efficient modems is essential in order to maintain an acceptable level of system availability.
Thus, there exists a further need for a demodulation technology that allows for a sufficiently low packet loss ratio for DVB-RCS transmissions with higher power efficiency. As suggested above, numerous technical and physical problems complicate synchronization in satellite communications. For instance, synchronization may be difficult where the transmitter and receiver are moving relative to each other. Specifically, when a burst-mode communications transmitter is on or near the earth and the intended receiver is in a satellite (or when a satellite transmits to a terrestrial receiver), the spatial locations and the relative velocities of the transmitter and receiver change over time. The change in spatial location causes the propagation path length and the signal propagation time to change, and the change in relative velocities causes a Doppler shift that changes the frequency of the burst-mode signal when it is received at the intended receiver. As a consequence, the burst-mode signals, originally transmitted at fixed intervals, arrive at varying time intervals. Furthermore, varying weather conditions, such as clouds and rain, also affect the communication signals. There is also a certain level of inherent carrier frequency uncertainty at the transmitter output. Overall, these and other conditions cause carrier frequency offset in burst-mode communications.
These issues are particularly present in DVB-RCS communications. At a DVB-RCS transmitter output, a 30 GHz carrier will generally appear with some carrier frequency offset fo, or residual error, so that the carrier frequency fc = 30 GHz ± fo. As suggested above, contributors to the carrier frequency offset include movement of the satellite, which creates a satellite Doppler effect; uncertainty due to the satellite's transponder; uncertainty or changes at the transmitter as to the exact carrier frequency; and the path length and atmospheric conditions en route. These and other contributors in the system cause the carrier to deviate from its nominal value, and synchronization is performed to correct for the carrier frequency offset and return the carrier to its nominal or baseband state. In a sense, synchronization is a fine-tuning process for best receiver performance.
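As a rough illustration of one contributor named above, the first-order Doppler shift for a Ka-band carrier can be estimated as fd = (v/c)·fc. The 3 m/s relative velocity below is an assumed figure for a station-kept GEO satellite, not a value from the DVB-RCS standard.

```python
# Illustrative estimate of the Doppler contribution to carrier frequency
# offset fo. The 3 m/s relative velocity is an assumed figure for a
# station-kept GEO satellite, not a value taken from the DVB-RCS standard.

C_LIGHT = 299_792_458.0  # speed of light, m/s

def doppler_offset(carrier_hz, relative_velocity_mps):
    """First-order Doppler shift: f_d = (v / c) * f_c, in Hz."""
    return carrier_hz * relative_velocity_mps / C_LIGHT

# ~300 Hz of offset for a 30 GHz Ka-band uplink at 3 m/s relative velocity.
fo = doppler_offset(30e9, 3.0)
```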
Accordingly, the synchronization process generally includes signal detection, finding the right timing (i.e., the symbol timing), finding the carrier frequency offset, and tracking the carrier phase. After the right combination of these factors is determined, thereby enabling synchronization, the data signal is demodulated from the carrier wave and the encoded symbols represented by the data signal are passed to a decoder to extract and deliver the payload data in the burst.
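As a minimal sketch of one of these steps, the carrier frequency offset can be estimated over a known preamble by wiping off the data modulation and averaging the residual phase rotation per symbol. The preamble, symbol rate, and function names below are illustrative assumptions, not taken from the DVB-RCS specification.

```python
import cmath
import math

# Sketch of data-aided carrier frequency offset estimation over a known
# preamble. All names and parameters here are illustrative assumptions.

def estimate_freq_offset(received, preamble, symbol_rate):
    """Estimate carrier frequency offset (Hz) from the average phase step
    between consecutive modulation-wiped preamble samples."""
    # Remove the known data modulation, leaving only the offset rotation.
    wiped = [r * p.conjugate() for r, p in zip(received, preamble)]
    # Accumulate the phase increment between consecutive samples.
    acc = sum(wiped[k + 1] * wiped[k].conjugate() for k in range(len(wiped) - 1))
    phase_step = cmath.phase(acc)                     # radians per symbol
    return phase_step * symbol_rate / (2 * math.pi)   # Hz

# Simulated burst: a BPSK preamble rotated by a 500 Hz offset at 1 Msym/s.
preamble = [complex(p) for p in (1, -1, 1, 1, -1, 1, -1, -1)]
f_true, rs = 500.0, 1e6
rx = [p * cmath.exp(2j * math.pi * f_true * k / rs)
      for k, p in enumerate(preamble)]
f_hat = estimate_freq_offset(rx, preamble, rs)
```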
In synchronous digital transmission, information is conveyed by uniformly spaced pulses and the function of any receiver is to isolate these pulses as accurately as possible. However, the received signal has undergone changes during transmission due to the noisy nature of the transmission channel, resulting in signal distortion, such as carrier frequency offset and phase noise. Complete estimation and removal of these sources of signal distortion is necessary prior to data detection. A proper synchronization of a burst signal is needed to decode the transmitted data, and this process typically involves the identification, estimation and removal of these sources of signal distortion.
Conventional synchronization methods usually operate at relatively high signal-to-noise ratios that allow reliable synchronization. In addition, conventional burst signal demodulation techniques perform synchronization and then decoding in a serial manner. For this reason, a receiver generally includes a cascade of receiving-filter, synchronization, and decoding stages to process received burst signals. Since the return channel operates at a very low signal-to-noise ratio, carrier synchronization using this traditional approach cannot alone give the right signal constellation. The currently known techniques for improving synchronization estimates generally rely on certain threshold criteria, such as having a sufficiently high signal-to-noise ratio, to determine the points where the synchronization problem occurs. Below the threshold value, the conventional algorithms fail to operate properly. Much of the current research and development is directed toward developing techniques to reduce this threshold level.
Theoretical studies can be conducted on existing estimators to determine their fundamental performance level, or baseline, and this type of measurement may give a theoretical bound for the performance of known techniques and algorithms used for parameter estimation. The theoretical limits to the performance of existing estimator designs (i.e., their performance under ideal conditions) are inadequate to meet the needs of faint burst-mode signals transmitted at a low power level, such as on a return link channel in a DVB-RCS system.
As a result, there is an on-going need for techniques and systems for detecting and correcting for signal distortion, such as the carrier frequency offset and phase noise, that typically affect a burst-mode satellite communication system experiencing a low signal-to-noise ratio.