The format of a transmitted burst signal typically consists of a unique word (UW) followed by the user data. The length of the UW is determined by the desired acquisition probability and the signal-to-noise ratio (SNR) of the received signal. A burst is declared detected when the UW is detected, which is accomplished by correlating the received signal with a reference copy of the UW.
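As a sketch of this correlation-based detection, the snippet below builds a burst (noise, then a UW followed by user data), slides the reference UW across the received samples, and declares a detection where the correlation magnitude exceeds a threshold. The UW pattern, burst layout, noise level, and threshold are all hypothetical choices for illustration, not values from the text.

```python
import random

random.seed(0)

UW = [1, -1, 1, 1, -1, -1, 1, -1, 1, 1, -1, 1]      # hypothetical BPSK unique word
data = [random.choice([1, -1]) for _ in range(32)]  # user data following the UW
noise_std = 0.3                                     # assumed noise level

# Received samples: 20 leading noise-only samples, then UW + data with AWGN.
lead = [random.gauss(0, noise_std) for _ in range(20)]
burst = [s + random.gauss(0, noise_std) for s in UW + data]
r = lead + burst

def correlate(r, uw):
    """Correlation magnitude of r with the reference UW at each offset."""
    n = len(uw)
    return [abs(sum(r[k + i] * uw[i] for i in range(n)))
            for k in range(len(r) - n + 1)]

c = correlate(r, UW)
threshold = 0.7 * len(UW)   # hypothetical threshold: 70% of the peak value
detections = [k for k, v in enumerate(c) if v >= threshold]
print(detections)           # offsets where a burst detection is declared
```

The correlation peaks at offset 20, where the UW begins; the choice of threshold is the subject of the trade-off discussed below.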
In a synchronous code division multiple access (CDMA) system, the received signal is typically modeled as r = Sd + n, where d is the vector of data symbols, S is the matrix of spreading (modulation) waveforms, and n is the noise. In the absence of a signal, the receiver observes only the noise component n, which is itself composed of many component signals.
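A minimal numerical sketch of the r = Sd + n model is given below, assuming two users with orthogonal Walsh-style spreading codes of length 8; the codes, spreading factor, and noise level are hypothetical. A matched-filter despread, (1/N) S^T r, approximately recovers the symbol vector d.

```python
import random

random.seed(1)

N = 8   # spreading factor (chips per symbol), an assumed value
# Columns of S: hypothetical orthogonal spreading codes for K = 2 users.
S = [[1, 1], [1, -1], [1, 1], [1, -1], [1, 1], [1, -1], [1, 1], [1, -1]]
d = [1, -1]                                  # one data symbol per user
n = [random.gauss(0, 0.2) for _ in range(N)]  # noise with assumed std 0.2

# r = S d + n : each received chip superposes all users' chips plus noise.
# With no signal present (d = 0), the receiver would observe only n.
r = [sum(S[i][k] * d[k] for k in range(len(d))) + n[i] for i in range(N)]

# Matched-filter despread: d_hat = (1/N) S^T r, close to d for orthogonal codes.
d_hat = [sum(S[i][k] * r[i] for i in range(N)) / N for k in range(len(d))]
print(d_hat)
```

Because the two codes are orthogonal, each user's despread output contains only that user's symbol plus averaged noise.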
Detection of a signal of interest is typically done by correlating the incoming signal r with one of a set of known bit sequences that serve as preambles to the user message. If the magnitude of the correlation of r with the unique word is large enough, a burst detection is declared. A threshold above which a correlation magnitude is considered “large enough” must therefore be established. If the threshold is too low, large numbers of false detections occur, causing congestion in the receiver; if it is too high, otherwise valid messages are missed.