Conventional methods of measuring and reporting hookload have not changed significantly since 1926. One notable change in this field was the shift from a diaphragm-type weight indicator to a strain cell integrated with the load-bearing pins. There have been incremental improvements over time, but no significant changes since then. One problem with conventional strain measurements is the long communication path the strain measurement travels before conversion to engineering units. Each component in the path creates a new source of noise and a new possible entry point for error until the data is in engineering units. FIG. 1 is a block diagram illustrating conventional paths and processing steps for the hookload signal from the measurement to the control system according to the prior art. In particular, FIG. 1 illustrates two processing paths 140 and 150. Both paths 140 and 150 include similar equipment (sensor 122, converter 124, barrier 126, barrier 128, converter 130, scaler 132, and controller 134) arranged differently in a pin 110, derrick cabinet(s) 112, and control cabinet 114. The equipment is arranged to convert the measured strain into a final engineering-unit measurement (such as kips).
FIG. 1 illustrates different process flows of the signal for the hookload measurement, from the transducer to the HMI display. The strain gauge in the pin outputs a millivolt signal that is accepted into an intrinsically safe barrier. The barrier converts this millivolt signal to a milliamp signal and drives the signal from the field station (e.g., junction box) to a control cabinet. Inside the control cabinet, a control system will either accept the current signal directly or, in some cases, present the signal to another intrinsically safe barrier that converts it into a 0 to 10 Volt signal, terminated into an analog input on the control system I/O device. This signal now represents the minimum and maximum output of the original strain gauge. The PLC is instructed to assign a “real world” value to the measured load, commonly represented in Tons or kips. An analog/digital (A/D) converter in the PLC assigns a value to present to the user based on a formula that takes known minimum and maximum values and creates a slope from their difference. In short, a real-world value is displayed to the user based on the amount of force applied to the strain gauge.
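The slope-based scaling described above can be sketched as follows. This is a minimal illustration of a linear (slope/offset) mapping from a raw signal range to an engineering range; the function name, signal ranges, and load range are assumptions chosen for illustration, not values from any particular installation.

```python
# Minimal sketch of the PLC's slope-based scaling: a known (min, max)
# signal range is mapped onto a (min, max) engineering range.
# All numeric ranges below are illustrative assumptions.

def scale_linear(value, in_min, in_max, out_min, out_max):
    """Map a raw signal value onto an engineering range by linear slope."""
    slope = (out_max - out_min) / (in_max - in_min)
    return out_min + slope * (value - in_min)

# Example: a 4-20 mA loop signal assumed to represent 0-500 kips.
current_ma = 12.0                       # mid-scale loop current
load_kips = scale_linear(current_ma, 4.0, 20.0, 0.0, 500.0)
print(load_kips)                        # 250.0 (mid-scale maps to mid-range)
```

The same function covers each conversion stage in FIG. 1 (mV to mA, mA to 0-10 V, and the final A/D scaling), since each stage is a straight-line mapping between a known input range and a known output range.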
Conventional hookload measurement is currently derived from a few different methods, including: load cells installed in pins connecting the topdrive to the travelling block, load cells installed in pins on the crown block, load cells installed in the deadline, and strain measurement sensors installed on the Steel Wire Rope (SWR). The first three methods involve property of the drilling contractor, whereas the fourth is installed by a third-party mud logging service provider. The attraction of the fourth type of installation is that it does not depend on any rig-based instrumentation and can easily be installed without taking the block out of service. The downsides are that it is susceptible to breakage and dampening, and its accuracy is debatable.
The third method, in which the load measurement is taken from the deadline, has been a conventional method consisting of a stepdown piston and hydraulic hose connected directly to a mechanical gauge, or to a pressure sensor that converts the pressure to an electrical signal for display. The strengths of this method include simplicity, ease of access, and ease of troubleshooting. The downsides are inherent dampening, lag, and overall accuracy concerns, as the sensor is located very far from the measurement point. Variations in WOB and HL can directly influence the control process as well as the drilling process.
The second method is an improvement on the third: it places the measurement location much closer to what is intended to be measured, and it removes the problems of a hydraulic circuit and pressure transducer by using a strain gauge sensor. One or more strain gauge sensors are located in each load-bearing clevis pin required to lock the crown block into its position. One conventional installation includes four load pins providing four load measurements. For an accurate measurement in a marine environment, all four sensors need to be operational, as the load distribution across the four pins is not expected to be homogeneous.
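Because the load is not distributed evenly across the four pins, the total hookload must be the sum of all four readings, and a non-reporting sensor invalidates the total. A minimal sketch of that summation and validity check, with invented pin readings and `None` assumed as the fault indication:

```python
# Hypothetical sketch: total hookload from four crown-block load pins.
# The per-pin readings and the use of None as a sensor-fault marker
# are illustrative assumptions.

def total_hookload(pin_loads):
    """Sum the four pin loads; reject the measurement if any pin is down."""
    if len(pin_loads) != 4 or any(v is None for v in pin_loads):
        raise ValueError("all four load pins must be operational")
    return sum(pin_loads)

# Unequal readings are expected: the distribution is not homogeneous.
print(total_hookload([112.0, 98.5, 121.3, 104.2]))  # total of the four pins
```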
During the manufacturing process, the strain gauge load cell is exposed to the full range of its intended loads on a hydraulic press. Incorporated into this press is a calibration load cell that is traceable back to NIST (National Institute of Standards and Technology). A calibration certificate accompanies a load cell with two or more (typically around ten) calibration value pairs. As the strain gauge load cell does not natively output mA (current loop), a specialized signal conditioner (e.g., KFD2-WAC-Vx1d) is required. The strain measurement is accomplished by supplying an excitation voltage across two points of the Wheatstone bridge and then measuring the resultant voltage on the other side. The signal native to the strain cell is proportional to the excitation voltage and to the variation of the measurement section's resistance; the resulting signal units are mV/V. The signal in this form cannot be used directly by a control system. The signal conditioner converts the mV/V measurement to a current loop signal (4-20 mA). This resultant signal can be used by the control system; however, in order to use this signal and the factory calibration, the signal conditioner must always remain connected and paired with the specific load cell in the circuit. The signal conditioner has ‘zero’ and ‘span’ adjustments (potentiometers or digitally configured); if these are adjusted in the field, or a different conditioner is used, the factory calibration is invalidated.
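Applying the factory calibration amounts to interpolating between the certificate's value pairs. The sketch below shows piecewise-linear interpolation over an assumed calibration table; the mV/V and load values are invented for illustration (a real certificate typically carries around ten NIST-traceable pairs):

```python
# Hypothetical sketch: converting the load cell's native mV/V signal to
# load via the factory calibration table. Table values are invented.
from bisect import bisect_left

CAL_TABLE = [  # (signal in mV/V, load in kips) - illustrative pairs
    (0.0, 0.0), (0.5, 125.0), (1.0, 251.0), (1.5, 378.0), (2.0, 506.0),
]

def signal_to_load(mv_per_v):
    """Piecewise-linear interpolation over the calibration pairs."""
    xs = [s for s, _ in CAL_TABLE]
    i = bisect_left(xs, mv_per_v)
    if i == 0:
        return CAL_TABLE[0][1]          # at or below the lowest pair
    if i == len(CAL_TABLE):
        return CAL_TABLE[-1][1]         # above the highest pair
    (x0, y0), (x1, y1) = CAL_TABLE[i - 1], CAL_TABLE[i]
    return y0 + (y1 - y0) * (mv_per_v - x0) / (x1 - x0)

print(signal_to_load(0.75))  # 188.0, halfway between the 125.0 and 251.0 pairs
```

Note that the slight nonlinearity visible in the table (125, 251, 378, 506 rather than exact multiples) is why interpolating over the full set of factory pairs is preferable to a single two-point span adjustment.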
A deficiency sometimes seen in the industry during the installation process is that, once the load cells are installed in the field, another effort to derive the same coefficients is made, but with roughly estimated loads. To accomplish this in-field calibration, the field engineer requests that the rig crew apply as much load as possible to the hook. The load applied is approximate (unless a reference cell is available on board), and the full range of the load cell normally cannot be realized offshore except during operations. The issues with this method are: the reference load used is not calibrated to a NIST or other known standard; the load is not applied through the entire range; operation requires recalibration of the draw-works when replacing a load cell or barrier; and the measurements are subject to field errors.
The obvious answer to the above problems is to use the original calibration. It is not clear why this is not always done at present. It can be speculated that it was used at one time, but that if the measured and actual loads did not match, the simplest solution in the field would have been to adjust the measurements to align with the test load on board the rig. This would then require an in-field ‘re-calibration’. As mentioned above, there are also load cells installed at the deadline. These load cells will be less accurate, as they are farther away from the measurement point. If both the load pin and the deadline load cell are installed, the friction losses in the system need to be accounted for to ensure the measurements corroborate one another. A simple model typically used to estimate some of these losses is shown in the following equation:
e_rev-mech = (e_t-mech^Nl − 1) / (Nl · e_t-mech^Nl · (e_t-mech − 1))

where e_t-mech = tackle efficiency = 1.015; Nl = number of lines; F_hl = hookload observed; F_fs = load on the fast line, with F_hl = F_fs · Nl · e_rev-mech.
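The tackle-efficiency model above can be evaluated directly. The sketch below implements e_rev-mech from the per-sheave tackle efficiency and the number of lines, then applies F_hl = F_fs · Nl · e_rev-mech; the 12-line configuration and 50-kip fast-line load are example values only, not from the source.

```python
# Minimal sketch of the tackle-efficiency model:
#   e_rev-mech = (e^Nl - 1) / (Nl * e^Nl * (e - 1))
#   F_hl = F_fs * Nl * e_rev-mech
# e_tackle = 1.015 is the tackle efficiency given in the text;
# the line count and fast-line load below are illustrative.

def reverse_mech_efficiency(e_tackle, n_lines):
    """Compute e_rev-mech from per-sheave efficiency and line count."""
    e_n = e_tackle ** n_lines
    return (e_n - 1.0) / (n_lines * e_n * (e_tackle - 1.0))

def hookload(fast_line_load, e_tackle, n_lines):
    """F_hl = F_fs * Nl * e_rev-mech"""
    return fast_line_load * n_lines * reverse_mech_efficiency(e_tackle, n_lines)

e_rev = reverse_mech_efficiency(1.015, 12)   # ~0.91 for 12 lines
print(hookload(50.0, 1.015, 12))             # hookload for a 50-kip fast-line load
```

As expected for an efficiency model, e_rev-mech stays below 1 and decreases as the line count grows, so the observed hookload is always somewhat less than the ideal F_fs · Nl.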
The calculation above only addresses tackle efficiency; there will be other friction losses that need to be accounted for. In 2012, Hookload was defined by U.MME with NTNU as “The sum of vertical components of the forces acting on the drillstring attached to the hook.” Other friction losses are expected, even for the becket-pin style load cell installation. They are expected to be relatively small, but those losses should be quantified. The importance of the hookload measurement in the control system is that the system executes configured responses based on certain deviations of hookload during various operations. If the hookload values are not reliable, the system may not respond in a predictable manner, which poses a challenge to the user.
First, it is important to recognize that some system suppliers have termed the in-field rescaling of the load measurement a calibration, despite it in fact not being a calibration. An in-field re-scaling is not sufficient and, as a result, introduces unnecessary error into the load measurement. It can be argued that this error is significant enough to have contributed to the need for recalibrations of multiple installations in the past. During factory testing, these load cells pass through a series of tests. The pin is put through its usable range, and the manufacturer generates a table that maps the electrical signals from the pin's strain measurement circuit to a real-world load. This mapping is accomplished with a known degree of accuracy by using a NIST-traceable load cell.
Conventionally, there can be two or more “calibrations” performed for the draw-works load cells. The first calibration occurs at the factory, where a load cell is exposed to its range of forces. The measurement of these forces is done with a NIST (National Institute of Standards and Technology) traceable load cell that permanently resides at the factory. For a specific pin, the forces and the corresponding electrical signals are captured during the factory calibration process and provided as a table with the load cell's certificates. The second calibration, which occurs once the load cell is installed on board the vessel, is a field calibration using a field procedure. To summarize the procedure, it attempts to expose the load cell, as fitted in the draw-works, to estimated loads as opposed to known (e.g., NIST-traceable) loads. Another drawback is that the load cell is not exposed to its entire range, but only a fraction of it. The loads experienced by the crown or travelling block pins will not be equal across all load cells, due to the load distribution through the sheaves and asymmetric friction losses from the mechanical coupling. This inequality may have caused contention with the original design, which may be why a field calibration was instituted.