Among all logging tools deployed in the wellbore, resistivity tools provide the largest depth of detection. As a result, they have been extensively used for detecting formation layer boundaries in applications such as landing or well placement. Moreover, such logging tools are utilized to acquire various other characteristics of the earth formations traversed by the wellbore, as well as data relating to the size and configuration of the wellbore itself. The collection of information relating to downhole conditions, commonly referred to as “logging,” can be performed by several methods, including wireline logging and “logging while drilling” (“LWD”).
The depth of detection provided by a logging tool is directly proportional to the distance between the transmitter and the receiver. As a result, most deep-reading tools employ very large transmitter-receiver spacings. For example, some deep-reading resistivity tools can be as long as 50-100 feet, and they operate at frequencies below 8 kHz to compensate for the geometrically increasing attenuation at larger transmitter-receiver separations. In contrast, standard, shallower tools have a range of about 20 feet and are optimized for placing wells in reservoirs within about 10 feet of the top or bottom boundary of the reservoir rock.
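The trade-off between spacing and frequency can be illustrated with a rough first-order model. The sketch below assumes a homogeneous formation, a point magnetic-dipole source with 1/r³ geometric spreading, and exponential skin-effect attenuation governed by the textbook skin depth δ = √(2ρ/ωμ₀); the function names and the 1 ohm-m formation resistivity are illustrative assumptions, not parameters of any actual tool.

```python
import math

MU0 = 4 * math.pi * 1e-7   # permeability of free space, H/m
FT_TO_M = 0.3048           # feet to meters

def skin_depth(freq_hz, resistivity_ohm_m):
    """Electromagnetic skin depth: delta = sqrt(2*rho / (omega*mu0)), in meters."""
    omega = 2 * math.pi * freq_hz
    return math.sqrt(2 * resistivity_ohm_m / (omega * MU0))

def relative_amplitude(spacing_ft, freq_hz, resistivity_ohm_m=1.0):
    """Crude relative field amplitude at the receiver: 1/r^3 geometric
    spreading of a magnetic dipole times exp(-r/delta) skin-effect loss.
    Illustrative only; a real tool response is far more complex."""
    r = spacing_ft * FT_TO_M
    delta = skin_depth(freq_hz, resistivity_ohm_m)
    return math.exp(-r / delta) / r**3
```

Under this toy model, lowering the operating frequency increases the skin depth and so reduces the exponential loss, which is why a 100-foot spacing paired with a sub-8 kHz frequency can still yield a measurable signal, whereas the same spacing at a typical shallow-tool frequency would not.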
The large transmitter-receiver spacings of deep-reading tools create problems in calibration, since most conventional calibration methods (air hang, test tank, or oven, for example) require a certain stand-off from any nearby objects that might interfere with the calibration measurement signals. As a result, it is impractical to apply these conventional calibration techniques to a deep-reading resistivity tool: the tool's sensitive volume is so large that it is not feasible to build facilities big enough to fully contain it.
Accordingly, there is a need in the art for a practical technique for calibrating a deep-reading resistivity logging tool.