Nuclear Magnetic Resonance (NMR) tools used for well logging or downhole fluid characterization measure the response of nuclear spins in formation fluids to applied magnetic fields. Downhole NMR tools typically have a permanent magnet that produces a static magnetic field at a desired test location (e.g., where the fluid is located). The static magnetic field induces an equilibrium nuclear magnetization in the fluid, aligned along the direction of the static field. The magnitude of the induced magnetization is proportional to the magnitude of the static field; the proportionality constant is the static nuclear magnetic susceptibility. A transmitter antenna produces a time-dependent radio frequency magnetic field perpendicular to the direction of the static field. The NMR resonance condition is satisfied when the radio frequency equals the Larmor frequency, which is proportional to the magnitude of the static magnetic field. The radio frequency magnetic field exerts a torque on the magnetization vector that causes it to rotate about the axis of the applied radio frequency field. The rotation gives the magnetization vector a component perpendicular to the direction of the static magnetic field, and this component precesses around the static field at the Larmor frequency. At resonance between the Larmor and transmitter frequencies, the magnetization is tipped into the transverse plane (i.e., the plane normal to the static magnetic field vector). A series of radio frequency pulses is then applied to generate spin echoes, which are measured with the antenna.
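For protons, the proportionality between the Larmor frequency and the static field magnitude is the gyromagnetic ratio, f_L = (γ/2π)·B0 ≈ 42.58 MHz/T. The following is a minimal sketch of this relation; the function name and the field strength are illustrative assumptions, not tool specifications:

```python
import math  # not strictly needed here, kept for clarity of the 2*pi convention

# Proton gyromagnetic ratio divided by 2*pi, in Hz per tesla.
GAMMA_PROTON_HZ_PER_T = 42.577478518e6

def larmor_frequency_hz(b0_tesla):
    """Proton Larmor frequency (Hz) for a static field of b0_tesla."""
    return GAMMA_PROTON_HZ_PER_T * b0_tesla

# An illustrative static field of about 53.3 mT yields a Larmor
# frequency near 2.27 MHz, typical of the operating band discussed here.
f = larmor_frequency_hz(0.0533)
```

The linearity of this relation is why any change in the static field (temperature drift of the magnet, magnetic debris) shifts the Larmor frequency directly.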
The resonance condition requires that the transmitter radio frequency equal the Larmor frequency. Deviation between the two frequencies can lead to inaccurate estimation of porosity, in the case of logging tools, and hydrogen index, in the case of fluid sampling tools. In addition, the deviation can lead to systematic errors in the estimation of relaxation time distributions, thereby resulting in inaccurate estimates of, for example, viscosity, permeability, pore size distribution, irreducible water saturation, etc.
The Larmor frequency when operating in downhole conditions differs from that in the laboratory. This difference is caused by the temperature variation of the magnetization, and in some cases, by the accumulation of magnetic debris in the vicinity of the permanent magnet. Magnetic debris is frequently found in the drilling mud due to the drill string scraping metal particles from well casing while tripping in and out of the hole during drilling operations. The effect of magnetic debris on the Larmor frequency depends on the quantity and distribution of the debris. As a result, the Larmor frequency needs to be determined accurately for downhole conditions. Two methods are currently known for in-situ estimation of the Larmor frequency.
One method, the Larmor Frequency Search Task method, was developed by Freedman et al. and is described in U.S. Pat. No. 5,457,873. In this method, an initial estimate of the Larmor frequency is made based on the temperature of the tool. A series of NMR measurements is then made at different operating frequencies, and a predetermined response curve is fitted to the measurements to determine the frequency at which the maximum echo amplitude is obtained. The implementation of the method is illustrated in FIG. 1, which shows a plot of the initial echo amplitude, calculated from the mean of the first ten echoes, over a range of transmitter frequencies. The maximum amplitude is observed at 2.270 MHz, which agrees with an independent measurement of the Larmor frequency using a Hall probe. However, this process is time-consuming. Additionally, it requires that the formation porosity remain constant during the measurements, which may not hold if measurements are performed while the tool is moving. Moreover, in low-porosity formations, the Larmor frequency cannot be determined accurately because of the low signal-to-noise ratio (SNR).
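The peak-finding step of such a frequency sweep can be sketched as follows. This is a simplified illustration, not the patented implementation: the function name is invented, the echo amplitudes are a synthetic bell-shaped response rather than tool data, and a simple three-point parabolic interpolation stands in for the predetermined response curve of the patent:

```python
import math

def peak_frequency(freqs, amps):
    """Estimate the frequency of maximum echo amplitude by fitting a
    parabola through the largest sample and its two neighbours
    (three-point parabolic interpolation). Assumes uniform steps."""
    i = max(range(len(amps)), key=amps.__getitem__)
    if i == 0 or i == len(amps) - 1:
        return freqs[i]  # peak lies at the edge of the scanned band
    y0, y1, y2 = amps[i - 1], amps[i], amps[i + 1]
    df = freqs[1] - freqs[0]
    shift = 0.5 * (y0 - y2) / (y0 - 2.0 * y1 + y2)  # sub-step offset
    return freqs[i] + shift * df

# Synthetic sweep: bell-shaped response centred on an assumed true
# Larmor frequency of 2.270 MHz (illustrative values, not tool data).
f_true = 2.270e6
freqs = [2.250e6 + k * 5e3 for k in range(9)]  # 2.250-2.290 MHz, 5 kHz steps
amps = [math.exp(-((f - f_true) / 12e3) ** 2) for f in freqs]
est = peak_frequency(freqs, amps)
```

The sketch also makes the SNR limitation concrete: with noisy, low-amplitude echoes the curvature term in the denominator becomes unreliable, and the interpolated peak loses accuracy.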
A second method, the Echo Phase method, was developed by Bordon et al. and is described in U.S. Pat. No. 7,026,814. If the Larmor and transmitter frequencies differ, the phase of the echo signal changes along the echo interval with respect to the reference radio frequency phase. The difference between the instantaneous phases of the echo signal at two times is linearly related to the deviation between the Larmor and transmitter frequencies, provided the deviation is small. However, implementing this method requires a detailed calibration of the electronics (e.g., the phase shifts due to temperature need to be recorded). In addition, the phase difference is strongly influenced by antenna tuning (i.e., deviation of the resonance frequency of the antenna from the transmitter frequency). Therefore, the detuning of the antenna and its effect on the phase difference must also be calibrated.
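Under the small-deviation assumption, the frequency offset follows from the phase change between two samples of the complex echo signal, Δf = Δφ / (2π Δt). A minimal sketch of that relation is shown below; the function name and the synthetic baseband echo are assumptions, and the real method additionally requires the electronics and antenna-tuning calibrations described above:

```python
import cmath
import math

def frequency_offset_hz(sig, n1, n2, dt):
    """Estimate the Larmor/transmitter frequency deviation (Hz) from the
    phase change of a complex echo signal between samples n1 and n2,
    taken dt seconds apart. Valid only for small deviations, so that the
    phase does not wrap by more than pi between the two samples."""
    dphi = cmath.phase(sig[n2]) - cmath.phase(sig[n1])
    dphi = (dphi + math.pi) % (2.0 * math.pi) - math.pi  # wrap to (-pi, pi]
    return dphi / (2.0 * math.pi * dt * (n2 - n1))

# Synthetic echo at baseband: a 150 Hz offset between the Larmor and
# transmitter frequencies appears as a steadily rotating phase
# (illustrative numbers, not tool data).
dt = 1e-4           # 0.1 ms sample interval
offset_hz = 150.0   # assumed true deviation
sig = [cmath.exp(2j * math.pi * offset_hz * n * dt) for n in range(32)]
est = frequency_offset_hz(sig, 4, 20, dt)
```

In practice the measured phase also contains instrument contributions (temperature-dependent electronics and antenna detuning), which is why the calibrations noted above are needed before the offset can be attributed to the Larmor frequency alone.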