Corrosion measuring and monitoring methods take several forms. Ultrasonic readings, pH monitoring, use of corrosion coupons, and linear polarization methods are only a few of the numerous methods now in existence. One method which has proved useful in measuring corrosion rate is the electrical resistance method, illustrated in U.S. Pat. No. 3,104,355 issued to E. A. Holmes et al in 1963. A particular advantage of the electrical resistance method has been its ability to accurately model corrosion from a particular corrosive environment over a long period of time. Typically, corrosion rate by the electrical resistance method is determined by inserting sacrificial electrical resistance elements, or probes, into a corrosive environment, such as a plant stream, at significant points and measuring actual metal loss from these probes. Measured metal loss from these probes or elements, which are constructed of the same materials as the plant, is an indication of metal being lost by the vessels, piping, etc. of the plant. As the element corrodes, its cross-sectional area diminishes, increasing the electrical resistance of the element. The increasing resistance of the corroding active element is compared, for example through such means as a balanced-bridge circuit, to a resistance reading from a reference element which is protectively disposed in the corrosive environment. Both elements may be constructed of materials with similar temperature coefficients, so the comparison provides some automatic temperature compensation to the corrosion measurement system. The principle utilized is that, as the temperature of the environment changes, the corresponding resistance change of the reference element nullifies the resistance change of the active element due to temperature. This particular method of temperature compensation is more fully described in U.S. Pat. No. 3,104,355, previously referenced.
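The dependence of element resistance on remaining cross section can be illustrated with a short numerical sketch. This is not from the patent; the cylindrical element geometry, resistivity, and dimensions below are illustrative assumptions, using only the relation R = ρL/A.

```python
import math

def element_resistance(resistivity, length, radius):
    """Resistance of a cylindrical element: R = rho * L / A, A = pi * r**2."""
    return resistivity * length / (math.pi * radius ** 2)

# Illustrative values (assumptions, not taken from the patent):
rho = 1.7e-7      # ohm-m, roughly the resistivity of a carbon steel
length = 0.05     # m, element length
r0 = 0.5e-3       # m, initial element radius

# A radial metal loss of one millionth of an inch, converted to metres:
loss = 1e-6 * 0.0254

r_before = element_resistance(rho, length, r0)
r_after = element_resistance(rho, length, r0 - loss)
delta = r_after - r_before  # resistance rise caused by the metal loss
```

With these assumed dimensions the resistance rise per microinch of metal loss comes out on the order of a micro-ohm, consistent with the very small signals discussed below.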
The resistance of the element, after being partially compensated for temperature variation, is directly proportional to a corrosion dial reading. By plotting these dial readings over an extended period of time, for instance two weeks, a slope is obtained which corresponds to an average corrosion rate over that period. The resistance elements used as probes may be used in a number of environments and are available in a variety of alloys, each alloy corresponding to the particular metal whose loss is being determined.
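The slope-from-plotted-readings procedure above can be sketched as a simple least-squares fit. The function name and the sample two-week series of daily readings are hypothetical, not from the patent.

```python
def average_corrosion_rate(times_h, readings_mils):
    """Least-squares slope of dial readings (mils of metal loss) versus
    time (hours), converted to mils per year (MPY)."""
    n = len(times_h)
    mean_t = sum(times_h) / n
    mean_r = sum(readings_mils) / n
    num = sum((t - mean_t) * (r - mean_r)
              for t, r in zip(times_h, readings_mils))
    den = sum((t - mean_t) ** 2 for t in times_h)
    slope_mils_per_hour = num / den
    return slope_mils_per_hour * 24 * 365  # hours in a year

# Hypothetical two weeks of daily dial readings drifting upward
# at 0.001 mil of metal loss per day:
times = [24 * d for d in range(14)]
readings = [0.001 * d for d in range(14)]
rate = average_corrosion_rate(times, readings)  # 0.001 mil/day -> 0.365 MPY
```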
The basic problem with corrosion rate determination by the electrical resistance method has been the inability to accurately measure the slight amounts of metal lost per hour or even per day, even though the cumulative effect over a longer period, such as one year, may be unacceptable. Metal losses for usual corrosion rates are on the order of millionths of an inch per hour, with corresponding changes in probe resistance of tenths of a micro-ohm. The problem of accurately determining small amounts of corrosion resolves itself into one of signal-to-noise ratio, in which the "signal" to be preserved and enhanced is the change in probe resistance due to metal loss and the "noise" to be reduced is the composite of many extraneous effects, i.e., temperature, line voltage, and electrical pickup.
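The order of magnitude cited above can be checked with simple arithmetic, taking as an example the 10 MPY rate mentioned later in the text:

```python
# Express a 10 MPY corrosion rate as hourly penetration.
MILS_PER_YEAR = 10.0               # example rate; 1 mil = 0.001 inch
HOURS_PER_YEAR = 24 * 365
loss_per_hour_in = (MILS_PER_YEAR / 1000.0) / HOURS_PER_YEAR
# ~1.1e-6 inch/hour, i.e. about a millionth of an inch per hour
```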
Heretofore, these extraneous effects causing "noise" have limited the use of the electrical resistance method to a time-average system of corrosion measurement as opposed to a real-time "instantaneous" corrosion measurement, since the best time resolution possible has been one to two weeks. Hence, successive readings of corrosion loss in units of inches of penetration are taken, and by dividing the inches of penetration by the time interval (typically one to two weeks) and applying the proper conversion factors, an average corrosion rate in mils per year (MPY, thousandths of an inch per year) may be calculated. While the probes may be designed using a reference element to provide temperature compensation, temperature effects remain the single largest source of extraneous fluctuation or noise in the system. For instance, a 1° C. (centigrade) temperature change of the active resistance element has about 100 times the effect on probe resistance as corrosion for one hour at a 10 MPY rate. This may be partially compensated by a reference element, but a residual offset and resulting temperature effect, caused by such things as variations in manufacturing of the elements and minute variations in composition of the elements, exists in the active-reference element system. In addition to this residual imbalance effect, which exists even when the active and reference elements are at the same temperature, any difference in temperature between the elements, such as occurs during temperature changes of the corrosive atmosphere, will cause an even more pronounced transient imbalance effect. The time-average method and apparatus for corrosion measurement are illustrated in U.S. Pat. No. 3,094,865 issued to Dravnieks et al in 1963, and U.S. Pat. No. 3,104,355, discussed above.
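The averaging calculation described above (penetration divided by the measurement interval, with a units conversion) can be sketched as follows; the function name and the sample reading are hypothetical, not from the patent.

```python
def mpy_from_penetration(penetration_in, interval_days):
    """Average corrosion rate in mils per year (MPY) from a penetration
    reading in inches taken over an interval in days (1 mil = 0.001 inch)."""
    mils = penetration_in * 1000.0
    return mils * 365.0 / interval_days

# Hypothetical reading: 0.0004 inch (0.4 mil) lost over a two-week interval.
rate = mpy_from_penetration(0.0004, 14)  # ~10.4 MPY
```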