In modern applications such as radar detection and location, digital receivers are used to detect and characterize (e.g., determine the frequency of) electromagnetic signals (pulses). Such receivers are useful for various purposes including, for example, commercial communications, surveillance, and modern warfare applications such as threat detection and/or threat location. Digital receivers monitor electromagnetic energy to detect and characterize potential threat signals (e.g., enemy radar). These signals are obscured by noise that is generated internally in the receiver as well as by noise from external sources.
The signal-to-noise ratio (SNR) is the ratio of the power of the signal of interest (SOI) to the power of the noise (or unwanted signal). SNR is typically measured in decibels (dB). When the SOI is more powerful than the noise, the SNR is expressed as a positive number of decibels; when the noise is more powerful than the SOI, the SNR is expressed as a negative number of decibels.
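The decibel relationship described above follows the standard formula SNR(dB) = 10·log10(P_signal / P_noise). A minimal sketch in Python illustrates the sign convention; the function name and the assumption that both powers are given in the same linear units (e.g., watts) are illustrative, not from the source:

```python
import math

def snr_db(signal_power: float, noise_power: float) -> float:
    """Return the signal-to-noise ratio in decibels.

    SNR(dB) = 10 * log10(P_signal / P_noise). The result is positive
    when the signal of interest is stronger than the noise, zero when
    they are equal, and negative when the noise dominates.
    Both powers must be in the same linear units (illustrative
    assumption: watts).
    """
    return 10.0 * math.log10(signal_power / noise_power)

# Signal twice as powerful as the noise: about +3 dB
print(snr_db(2.0, 1.0))
# Noise ten times as powerful as the signal: -10 dB
print(snr_db(0.1, 1.0))
```

Because the scale is logarithmic, each factor-of-ten change in the power ratio shifts the SNR by 10 dB, which is why both very strong and very weak signals can be described compactly in decibels.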
When the SNR is below the pulse detection sensitivity limits of a digital receiver, the digital receiver becomes ineffective because potential threat signals can be missed and/or false threats may be detected. Additionally, even when a potential threat signal is detected, the SNR may be below the characterization sensitivity limits of the digital receiver, preventing accurate characterization of the threat signal. As a result, systems with low SNR can have poor sensitivity and may either fail to detect threats or, when a threat is detected, characterize it improperly. Furthermore, processes for improving SNR to increase sensitivity often require complex filters and/or calculations, which can result in increased cost and/or long latency.