Magnetic resonance sounding (MRS) is a technology currently developed in the field of geophysics for detecting groundwater directly and non-destructively. When groundwater is excited by an alternating magnetic field, a nuclear magnetic resonance signal is generated. This signal is an exponentially decaying sine wave, also referred to as a free induction decay (FID) signal. The FID signal is typically at the nano-volt (nV) level and can be detected only after amplification by a factor of thousands or tens of thousands. It is also highly susceptible to electromagnetic noise from the surrounding environment. The interference noise differs between test areas, and even within the same test area it varies over time.

In the amplification device of a conventional nuclear magnetic resonance detection system, the amplification factor is set only once, based on experience, before measurement begins. However, a single test point takes about two hours to complete, during which the environmental noise can change greatly, especially in areas with strong noise interference. The amplifying circuit is therefore apt to saturate, making it impossible to acquire valid data. For this reason, a fixed amplification factor rarely achieves the desired result when testing in different regions, or in the same region at different times: an amplifier with a fixed gain is prone to saturation and signal distortion, particularly in complex environments with serious noise interference. Therefore, in order to prevent amplifier saturation while still meeting the FID signal amplification requirements, it is of great significance to design an anti-saturation device for a ground magnetic resonance signal amplifying circuit.
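To make the saturation problem concrete, the following sketch models an FID signal as the exponentially decaying sine wave described above and passes it through a fixed-gain amplifier. All numeric values (initial amplitude, decay constant, Larmor frequency, gain, output swing, and the added power-line interference) are illustrative assumptions, not parameters from the source; the point is only that a gain chosen for the nV-level signal clips once strong environmental noise is superimposed.

```python
import numpy as np

# Illustrative FID model: e(t) = E0 * exp(-t/T2) * cos(2*pi*f_L*t).
# All parameter values below are assumptions for illustration.
E0 = 100e-9       # initial FID amplitude, volts (nV level, per the text)
T2 = 0.2          # effective decay constant, seconds (assumed)
f_L = 2000.0      # Larmor-like carrier frequency, Hz (assumed)
fs = 25000.0      # sampling rate, Hz (assumed)

t = np.arange(0.0, 0.5, 1.0 / fs)
fid = E0 * np.exp(-t / T2) * np.cos(2 * np.pi * f_L * t)

# Fixed-gain amplification: a gain of 1e4 brings 100 nV up to ~1 mV.
gain = 1e4
supply_limit = 2.5  # amplifier output swing, volts (assumed)

# Quiet site: the amplified FID stays far below the supply limit.
quiet = gain * fid

# Noisy site: strong 50 Hz power-line interference (0.5 mV, assumed)
# rides on the FID signal and, after the same fixed gain, exceeds the
# output swing, so the amplifier clips and the data are invalid.
noisy = gain * (fid + 5e-4 * np.cos(2 * np.pi * 50.0 * t))
clipped = np.clip(noisy, -supply_limit, supply_limit)

saturated = bool(np.any(np.abs(noisy) > supply_limit))
print(saturated)  # True: the fixed gain saturates under strong noise
```

The same gain that is appropriate at a quiet site drives the output rail-to-rail at a noisy one, which is why the text argues for an anti-saturation (adaptive-gain) design rather than a single empirically chosen amplification factor.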