This invention relates to detecting transitions, e.g., phase transitions imparted to a carrier tone or tones as part of a training signal sent from one modem to another.
A typical training sequence begins with the transmission of tones used by the receiver to detect the presence of signal, to set up the analog receiver gain, and to learn any clock or carrier frequency offset, among other things. The next part of the training sequence is a pseudo-random two-phase or four-phase symbol sequence. This sequence is known to the receiver and is used to set up its equalizer in a reference-directed fashion. In order to tell the receiving modem precisely when this part of the training sequence will begin, the transmitting modem precedes it by a 180 degree phase reversal (transition) in the initially transmitted tones. Often the transition occurs a number of symbol intervals before the training sequence is to appear, to provide a period following the transition that is long enough to allow reliable detection of exactly when the transition occurred.

Measuring the precise timing of the transition is also an important step in setting up echo cancelers in modems of the type that handle full duplex communication over the two-wire switched telephone network, as in the CCITT standard V.32. Setting up the echo cancelers requires determining the round-trip propagation delay, which is done using a handshake procedure based on phase transitions imposed on tones sent back and forth over the channel.
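As an illustrative sketch (not taken from any standard), the initially transmitted tone with its 180 degree phase reversal might be generated as follows; the sampling rate, tone frequency, and reversal position are arbitrary values chosen for the example:

```python
import math

# Hypothetical parameters for illustration only (not from V.32 or any standard):
FS = 9600          # sampling rate, Hz
F_TONE = 1800      # carrier tone frequency, Hz
N = 400            # total number of samples generated
REVERSAL_AT = 200  # sample index at which the 180 degree phase reversal occurs

def training_tone(n_samples=N, reversal_at=REVERSAL_AT):
    """Tone whose carrier phase flips by 180 degrees at `reversal_at`."""
    samples = []
    for n in range(n_samples):
        phase = 2 * math.pi * F_TONE * n / FS
        if n >= reversal_at:
            phase += math.pi  # the intentional transition
        samples.append(math.cos(phase))
    return samples

tone = training_tone()
```

The samples before `REVERSAL_AT` trace the unmodified tone; from that index onward every sample is negated relative to what the unmodified tone would have been, which is what the receiver looks for.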
In detecting the time when the transition occurs, the receiving modem must be able to distinguish the intentionally imposed transition from transient disturbances such as phase hits, gain hits, and impulse noise introduced by the telephone channel, which may have an appearance similar to the intentional transition.
In one transition detection scheme, a decision on the transition timing is made as soon as the received tones give the appearance that a transition has occurred. Such decisions are highly susceptible to errors caused by transient disturbances.
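A minimal sketch of such an early-decision scheme makes the weakness concrete; the tone parameters and the threshold are hypothetical values chosen for illustration:

```python
import math

FS, F_TONE = 9600, 1800  # hypothetical sampling rate and tone frequency, Hz

def early_decision(received):
    """Declare the transition at the FIRST sample that deviates from the
    expected (un-reversed) tone by more than a fixed threshold.
    No look-ahead, so no chance to reconsider the decision."""
    for n, r in enumerate(received):
        expected = math.cos(2 * math.pi * F_TONE * n / FS)
        if abs(r - expected) > 1.0:  # crude, illustrative threshold
            return n                 # decide immediately
    return None
```

A genuine 180 degree reversal makes the received sample disagree with the expectation and is detected, but a single impulse hit earlier in the signal trips the same test just as readily, producing a wrong, premature decision; this is the susceptibility described above.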
A theoretically optimum transition detector (for channels affected only by white Gaussian noise) is one which delays the decision on when the transition occurred as much as possible in order to find the one transition position where the tones match the received signal most closely. Because of the decision delay, such a detector will be more robust in the presence of transient disturbances.
In one implementation of the optimum transition detector, a correlator can be used to choose, as the decision, the one possible transition position which gives the highest correlation between the tones and the received signal. The correlation principle was used by Walsh in "Synchronization of a Data Communication Receiver with a Received Signal," U.S. Pat. 4,290,139, issued Sept. 15, 1981, which discloses a correlator to detect the presence of an expected segment (e.g., a transition) in a received signal. His scheme uses a finite impulse response (FIR) digital filter whose coefficients are chosen to be representative of the filter input during the expected segment. The detection decision is made based on the filter output, which is anticipated to be large when the expected segment is present. A similar scheme is disclosed by Miller in "Modem Signal Acquisition Technique," U.S. Pat. 4,462,108, issued July 24, 1984.
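The correlation principle can be sketched as follows; the tone parameters and the exhaustive search over every candidate position are illustrative assumptions for the example, not details taken from either patent:

```python
import math

FS, F_TONE = 9600, 1800  # hypothetical sampling rate and tone frequency, Hz

def reference(n0, length):
    """Reference tone whose 180 degree reversal is placed at index n0."""
    return [math.cos(2 * math.pi * F_TONE * n / FS + (math.pi if n >= n0 else 0))
            for n in range(length)]

def correlate_transition(received):
    """Delay the decision over the whole observation window and choose,
    among all candidate transition positions, the one whose reference
    waveform correlates most strongly with the received signal."""
    best_pos, best_corr = None, float("-inf")
    for n0 in range(len(received)):
        ref = reference(n0, len(received))
        corr = sum(r * x for r, x in zip(received, ref))
        if corr > best_corr:
            best_pos, best_corr = n0, corr
    return best_pos
```

Because the decision is based on the whole window rather than on the first suspicious sample, an isolated impulse hit perturbs every candidate's correlation by roughly the same small amount and does not displace the maximum, which is the robustness property noted above.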