Recent developments in drilling optimization use real-time analysis of the drilling system's energy consumption to optimize the rate of penetration (ROP). Such optimization can provide instantaneous ROP increases of 100-400% and corresponding increases in footage per day. Similar results have been achieved in soft and hard formations, in low- and high-angle wells, and with all rig types.
However, it is difficult to objectively assess operators' drill-rate performance. That is, bits are often evaluated on their performance relative to offsets, but drill rates are frequently constrained by factors outside the driller's control, and in ways that cannot be documented in a bit record. Consequently, drill rates may vary greatly between two wells running identical bits; how a bit is run is often more important than which bit is run.
Drillers conduct a variety of tests to optimize performance. The most common is the “drill rate” test, which consists of simply experimenting with various weight on bit (WOB) and bit rotational speed (RPM) settings and observing the results. The parameters that result in the highest ROP are then used for subsequent operations. In some sense, all optimization schemes use a similar comparative process. That is, they seek to identify the parameters that yield the best results relative to other settings.
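The comparative logic of a drill-rate test can be sketched as a simple search over candidate parameter settings. The sketch below is illustrative only: `measure_rop` stands in for field observation, and the `fake_rop` response surface (including its founder-style falloff) is an invented assumption, not a real bit model.

```python
def run_drill_rate_test(wob_settings, rpm_settings, measure_rop):
    """Return the (wob, rpm, rop) triple with the highest observed ROP.

    Mirrors a drill-rate test: try each WOB/RPM combination,
    observe the result, and keep the best-performing settings.
    """
    best = None
    for wob in wob_settings:
        for rpm in rpm_settings:
            rop = measure_rop(wob, rpm)
            if best is None or rop > best[2]:
                best = (wob, rpm, rop)
    return best

def fake_rop(wob_klb, rpm):
    """Hypothetical stand-in for a measured ROP (ft/hr); parameters invented."""
    linear = 2.0 * wob_klb * (rpm / 100.0)
    # Crude penalty so ROP falls off past an assumed founder point at 30 klb
    founder_penalty = max(0.0, wob_klb - 30.0) ** 2 * 0.5
    return max(0.0, linear - founder_penalty)

best_wob, best_rpm, best_rop = run_drill_rate_test(
    wob_settings=[10, 20, 30, 40], rpm_settings=[60, 100, 140],
    measure_rop=fake_rop,
)
```

In this invented example the search settles on 30 klb WOB at 140 RPM, since heavier weight is eaten by the founder penalty; the search structure, not the numbers, is the point.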
One of the earliest schemes was the "drilloff" test, in which the driller applied a high WOB and locked the brake to prevent the top of the string from advancing while continuing to circulate and rotate the string. As the bit drilled ahead, the string elongated and the WOB declined. ROP was calculated from the rate of drill-string elongation as the weight declined. The point at which ROP stops increasing linearly with WOB is referred to as the "flounder" or "founder" point, and is taken to be the optimum WOB. This process has enhanced performance, but it does not provide an objective assessment of the true potential drill rate.
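The elongation-to-ROP conversion above follows from treating the string as a linear spring (Hooke's law): elongation = W·L/(E·A), so with the brake locked the bit advances at the rate the string stretches as weight bleeds off. A minimal sketch of that calculation, assuming steel pipe and the invented example numbers shown:

```python
E_STEEL_PSI = 30e6  # Young's modulus of steel, psi (assumed typical value)

def drilloff_rop(delta_wob_lbf, delta_t_s, length_ft, pipe_area_in2):
    """Estimate ROP (ft/hr) from the WOB decline during a drill-off test.

    Hooke's law gives elongation = W * L / (E * A); with the top of the
    string fixed, the bit advances at the string's elongation rate.
    """
    length_in = length_ft * 12.0
    # Elongation rate (in/s) corresponding to the observed WOB decline
    elong_in_per_s = (delta_wob_lbf / delta_t_s) * length_in / (E_STEEL_PSI * pipe_area_in2)
    return elong_in_per_s * 3600.0 / 12.0  # in/s -> ft/hr

# Hypothetical example: 10,000 lbf drilled off in 60 s with 10,000 ft of
# drill pipe whose steel cross-section is about 5.27 in^2
rop_ft_hr = drilloff_rop(10_000.0, 60.0, 10_000.0, 5.27)
```

With these assumed inputs the estimate comes out near 38 ft/hr; repeating the calculation over successive WOB intervals traces the ROP-vs-WOB curve from which the founder point is read.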