This invention relates to the determination of a characteristic of earth formations penetrated by a borehole, and more particularly to the determination of formation porosity through neutron measurements.
In hydrocarbon exploration and production, it is of prime importance to determine (a) whether a given earth formation contains hydrocarbon, (b) the amount of hydrocarbon within the formation, and (c) the producibility of the hydrocarbon in place within the formation. The amount of hydrocarbon present within a formation is a function of the pore space, or “porosity,” of the formation. In drilling wells for the production of hydrocarbons, it is desirable to measure the porosity of each prospective hydrocarbon producing formation penetrated by the borehole. It is even more desirable, for economic and operational reasons, to determine the porosity of prospective formations during the actual drilling of the borehole.
Over the past decades, many technologies have been used to measure or estimate formation porosity from a borehole. One of these technologies is based on a system that contains an isotopic source that emits fast neutrons, and an axially spaced detector that responds to the flux of impinging thermal neutrons resulting from the interaction of fast neutrons with nuclei within the borehole and the formation in the vicinity of the borehole. The basic concept of this system is predicated on the facts that (a) hydrogen is the most effective moderator of fast neutrons because of its low atomic weight, and (b) most hydrogen found in earth formations is contained in the fluids filling the pore space of the formation, whether water, liquid hydrocarbon, or gas. The detector is axially spaced from the neutron source such that, for a given borehole condition, the count rate recorded by the thermal neutron detector decreases as the volumetric concentration of hydrogen, or porosity, increases.
Dual detector neutron porosity systems have been introduced to minimize the effects of the borehole upon the measurement of formation porosity. U.S. Pat. No. 3,483,376 and U.S. Pat. No. 5,767,510 disclose two thermal neutron detectors that are spaced axially at different distances from the source of fast neutrons. The ratio of the responses of the two detectors varies with formation porosity, yet is less sensitive to borehole parameters than the count rate from either of the two individual detectors. The ratio is therefore the measured parameter used to compute porosity. Historically, this ratio has been formed from the response of the detector closest to the source, or the “near” detector, divided by the response of the detector farthest from the source, or the “far” detector.
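The near/far ratio method described above can be sketched in code. The following is an illustrative example only: the calibration table, count rates, and function name are hypothetical, and an actual tool would rely on tool-specific calibration curves characterized in laboratory test formations.

```python
import bisect

# Hypothetical calibration table mapping near/far count-rate ratio to
# apparent porosity (porosity units). These values are invented for
# illustration; real tools use measured, tool-specific calibrations.
CALIBRATION = [(1.5, 0.0), (2.0, 5.0), (2.8, 15.0), (4.0, 30.0), (6.0, 45.0)]

def porosity_from_ratio(ratio):
    """Linearly interpolate apparent porosity (p.u.) from the near/far ratio."""
    ratios = [r for r, _ in CALIBRATION]
    if ratio <= ratios[0]:
        return CALIBRATION[0][1]
    if ratio >= ratios[-1]:
        return CALIBRATION[-1][1]
    i = bisect.bisect_right(ratios, ratio)
    (r0, p0), (r1, p1) = CALIBRATION[i - 1], CALIBRATION[i]
    return p0 + (p1 - p0) * (ratio - r0) / (r1 - r0)

# Hypothetical detector count rates in counts per second.
near_cps, far_cps = 1200.0, 400.0
ratio = near_cps / far_cps          # historically, near divided by far
phi = porosity_from_ratio(ratio)    # apparent porosity in porosity units
```

Because the ratio increases monotonically with porosity over the range of interest, a simple interpolation of a calibration curve suffices for this sketch; borehole corrections applied in practice are omitted.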
However, accuracy problems remain with these two-detector systems, since the response of such a neutron porosity tool varies significantly with the density of the formation being logged. Furthermore, this response is also a function of the thermal capture cross section (sigma). Minimizing the sigma response of a neutron porosity measurement is usually achieved by enclosing the neutron detectors in a highly absorbing thermal neutron shield through which only epithermal neutrons may penetrate. In that way, only epithermal neutrons are detected, resulting in very little sigma response.
Careful positioning of detectors with respect to the source can minimize density effects. It has been shown that a minimum in density response occurs at a certain unique distance from the source depending on the source energy (see Scott, H. D., et al., 1994, “Response of a Multidetector Pulsed Neutron Porosity Tool”, paper J, in 35th Annual Logging Symposium Transactions of the Society of Professional Well Log Analysts). Detectors placed at the point of minimum density sensitivity have been shown to have very little density response.
However, this technique has various drawbacks, among which is a strong restriction on the placement of the detectors, which in turn leads to significant mechanical constraints. Furthermore, the far detector still displays significant density sensitivity.