1.1. Field of the Invention
The present invention relates generally to a method and an apparatus for the reliable detection and quantification of the flow rate produced by a leak from pressurized pipeline systems containing petroleum, solvent, or other chemical liquids.
1.2. Brief Discussion of the Prior Art
There are a wide variety of pressurized pipeline systems carrying petroleum, solvents, and other chemical products that may contaminate or seriously damage the surrounding environment in the event of a leak. In underground or underwater pipelines, where visual inspection is not possible, a leak can be a significant problem. Small leaks in these pipeline systems (e.g., several tenths of a gallon per hour) can go undetected for long periods of time and result in a large cumulative release of product into the soil or groundwater, or into fresh or ocean water.
The need for leak detection capability in pressurized pipelines associated with underground storage tanks containing petroleum products has recently been identified. This need is an important one because the number of tanks involved is very large, and so is the volume of product dispensed through the pipelines associated with these tanks. The pipeline systems in question are most commonly made of steel or fiberglass; they are typically 2 in. in diameter, 50 to 200 ft long, buried 1.5 to 3 ft below grade, and are pressurized at 20 to 40 psi while product is being dispensed. In September 1989, the United States Environmental Protection Agency (EPA) issued technical standards for the detection of leaks in underground storage tanks containing petroleum or other hazardous chemicals and solvents. This regulation established the minimum performance standards that must be met by any leak detection system designed for testing the integrity of underground tanks and/or the pressurized pipelines associated with these tanks.
The EPA requires that underground storage tank (UST) pipeline systems that contain petroleum products be tested for leaks either on a monthly or an annual basis. To satisfy the criterion for monthly testing, a system must have the capability to detect leaks as small as 0.20 gal/h with a probability of detection (P_D) of 0.95 and a probability of false alarm (P_FA) of 0.05. To satisfy the criterion for annual testing, a system must be capable of detecting leaks as small as 0.10 gal/h with the same P_D and P_FA required of the monthly test.
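These two probabilities jointly bound the measurement noise that any detection system can tolerate. The sketch below rests on a simplifying assumption that is not part of the regulation: the measured flow rate is corrupted by zero-mean Gaussian noise, and a single fixed threshold is applied. Under that assumption it computes the largest noise standard deviation for which the monthly 0.20-gal/h criterion can be met.

```python
from statistics import NormalDist

nd = NormalDist()                 # standard normal
z = nd.inv_cdf(0.95)              # one-sided 95% quantile, ~1.645

leak = 0.20                       # gal/h, monthly-test leak size
# With zero-mean Gaussian noise of std sigma, a threshold T = z*sigma
# gives P_FA = 0.05.  Requiring P_D >= 0.95 at a 0.20-gal/h leak forces
# T <= leak - z*sigma, so the noise must satisfy sigma <= leak / (2*z).
sigma_max = leak / (2 * z)
threshold = z * sigma_max

p_fa = 1 - nd.cdf(threshold / sigma_max)
p_d = 1 - NormalDist(mu=leak, sigma=sigma_max).cdf(threshold)
print(f"sigma_max ~ {sigma_max:.4f} gal/h (threshold {threshold:.3f} gal/h)")
print(f"P_FA = {p_fa:.3f}, P_D = {p_d:.3f}")
```

Under these assumptions the tolerable noise is only about 0.06 gal/h, which indicates why the normally occurring fluctuations discussed below are so damaging to performance.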
There have been a number of approaches to leak detection in pipeline systems. Some leak detection systems are designed to operate while product is being moved through the line; others require that the flow of product be stopped for the duration of a test. Leak detection systems generally use one of three methods: they measure the drop in pressure in the pipeline over a period of time, they measure the difference in pressure or flow rate at two or more points along the pipeline, or they measure the change in the volume of the product over a period of time. Detecting small leaks is difficult because there are many physical phenomena present in pressurized pipeline systems that produce pressure, volume, and flow-rate fluctuations that are as large as or larger than those produced by a leak. These normally occurring fluctuations degrade the performance of the leak detection system and result in false alarms or missed detections. As a consequence, a number of compensation schemes have been proposed to reduce them.
1.2.1 Pipeline Leak Detectors That Measure Pressure
The most common approach to the detection of leaks in a pressurized underground pipeline containing an incompressible fluid at rest is to relate the pressure drop in the line to the flow rate of the leak. A leak in the line is declared if the pressure drops by a specified amount over a given period. If this specified amount, or threshold, is not exceeded, the line is declared tight. Pressure tests are very difficult to interpret because the pressure drops are coupled with the properties of the pipeline itself. Thus, a similar pressure drop in two different pipeline systems should not necessarily be interpreted in the same way. Experimental measurements with controlled leaks indicate that (1) the pressure decreases exponentially with time as product is released from a line, (2) the volume released from a line decreases linearly with pressure when no vapor is trapped in the line, and (3) the leak rate decreases exponentially with pressure. The relationships between pressure and (1) volume, (2) leak rate, and (3) time are controlled by the elasticity of the pipeline system. The properties of the line are usually measured in terms of the bulk modulus, which is the inverse of the elasticity constant. As the elasticity of the line increases, the time required for the pressure to decay from the operating pressure of the line to zero (or to any other pressure below the operating pressure) increases. In one line it might take 15 min for the pressure to drop 10 psi when there is a leak of 0.1 gal/h (defined at the operating pressure of the line), while in another line it might take 60 min. If the length of the test is defined as 15 min, the test protocol will prevent the sensor from detecting a 0.1-gal/h leak in some of the lines that are tested.
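The line-to-line variability described above can be illustrated numerically. The sketch below integrates dP/dt = -(B/V) * Q(P) for two hypothetical lines with identical leaks; the orifice-like leak model Q(P) = Q0 * sqrt(P/P0), the line volume, and the bulk moduli are all assumed values chosen for illustration, not measurements from the text.

```python
import math

def time_to_drop(bulk_psi, vol_gal, p0=30.0, q0=0.1, dp_target=10.0, dt=0.001):
    """Euler-integrate dP/dt = -(B/V) * Q(P) with an orifice-like leak
    Q(P) = q0 * sqrt(P/p0) (an assumed model).  Returns the time, in
    hours, for the pressure to fall by dp_target psi."""
    p, t = p0, 0.0
    while p > p0 - dp_target:
        q = q0 * math.sqrt(max(p, 0.0) / p0)   # gal/h at current pressure
        p -= (bulk_psi / vol_gal) * q * dt     # psi lost this step
        t += dt
    return t

# Two hypothetical lines with identical 0.1-gal/h leaks (defined at the
# 30-psi operating pressure) but different effective bulk moduli: more
# trapped vapor means a lower B, greater elasticity, and slower decay.
for b in (20000.0, 5000.0):
    minutes = time_to_drop(b, vol_gal=16.0) * 60
    print(f"B = {b:7.0f} psi: {minutes:5.1f} min to drop 10 psi")
```

Because the decay time scales inversely with B, a fixed-length test protocol tuned to one line can fail to detect the same 0.1-gal/h leak in a more elastic line, which is the protocol failure described above.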
Some of the pressure changes that occur in pressurized pipelines are not associated with a leak. The most important are those associated with the thermal expansion or contraction of the liquid, the trapped vapor, and the pipe material itself. Experimental measurements in underground pipeline systems containing petroleum indicate that the pressure changes are directly proportional to the temperature changes and the bulk modulus of the pipeline system. These temperature-induced pressure changes occur frequently in both leaking and nonleaking pipelines. When the pressure changes in a leaking pipeline are no greater than these normally occurring temperature-induced changes, it is difficult to detect a leak by monitoring the line for drops in pressure.
Accurate detection of a leak demands (1) that both the instrumentation and protocol have sufficient sensitivity to detect the smallest leaks of interest, (2) that the temperature changes in the line be measured and compensated for, and (3) that the pressure changes be related to the flow rate of the leak. All three require that the range of the elasticity properties of the pipelines that will be tested be known. The second requires that the temperature of the product be measured. The third requires that the pressure-volume relationship be measured each time for each line being tested.
1.2.1.1 Bulk Modulus
The bulk modulus of a pipeline is defined by the relationship between pressure and volume within that line. The bulk modulus of both the line and the product must be known before one can convert the pressure and temperature changes to volume changes or before one can interpret the meaning of a pressure drop. One can estimate the bulk modulus by simultaneously measuring the pressure of the line and the volume of product released through a valve in the line. Errors in determining this relationship occur if the line is leaking, if the temperature of the product in the line is changing, or if vapor or air is trapped in the line. Accurate calibration is difficult because the integrity of the line is unknown, as are the temperature of the product in the line and the volume of trapped vapor. Furthermore, the bulk modulus of the pipeline system changes over time as the volume of trapped vapor and air changes, and as the elasticity of the flexible hosing, the mechanical leak detector, and the pipe material changes.
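As a sketch of the calibration described above, the following estimates the bulk modulus from the relation B = -V * (dP/dV), using an ordinary least-squares slope fitted to simultaneous readings of released volume and line pressure. The data points and line volume are hypothetical, and the fit assumes the error-free conditions the text warns are rarely met (no leak, no temperature change, no trapped vapor).

```python
def bulk_modulus(line_volume_gal, released_gal, pressures_psi):
    """Estimate B = -V * dP/dV from simultaneous bleed-valve volume and
    pressure readings, via an ordinary least-squares slope."""
    n = len(released_gal)
    mx = sum(released_gal) / n
    my = sum(pressures_psi) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(released_gal, pressures_psi))
    sxx = sum((x - mx) ** 2 for x in released_gal)
    slope = sxy / sxx                  # psi per gallon released
    return -line_volume_gal * slope    # psi

# Hypothetical calibration of a 16-gal line: pressure falls linearly as
# product is bled off (ideal case: no vapor, no leak, constant temperature).
v = [0.000, 0.002, 0.004, 0.006, 0.008]   # gal released (cumulative)
p = [30.0, 27.5, 25.0, 22.5, 20.0]        # psi
print(f"estimated bulk modulus ~ {bulk_modulus(16.0, v, p):.0f} psi")
```

In a real calibration, a leak, a temperature drift, or trapped vapor would bias the fitted slope, which is why the text treats accurate calibration as difficult.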
1.2.1.2 Thermally Induced Pressure Changes
Thermally induced fluctuations in pressure are the major source of error in detecting a liquid leak with a pressure detection system. The magnitude of the error depends on the magnitude of the coefficient of thermal expansion and the bulk modulus of the liquid and the line material. For gasoline motor fuels, whose coefficient of thermal expansion is 6 to 7 times larger than that of water, even small temperature changes have been shown to produce large pressure changes (e.g., a 0.1°C fluctuation in temperature can cause the pressure to change by 10 psi). Furthermore, both theoretical and experimental analyses demonstrate that the rate of change of temperature in an underground pipeline system can be high and complicated.
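The quoted numbers are consistent with the rigid-line relation ΔP = B_eff * α * ΔT for a closed, liquid-full line. The sketch below uses assumed, handbook-style values for the expansion coefficient of gasoline and for the effective bulk modulus of the liquid-plus-line system; neither value is taken from the text.

```python
# Assumed representative values (illustrative, not from the specification):
alpha_gasoline = 9.6e-4     # 1/degC, volumetric thermal expansion of gasoline
bulk_eff = 1.0e5            # psi, effective bulk modulus of liquid + line

def thermal_pressure_change(dT_c):
    """Pressure change dP = B_eff * alpha * dT for a closed,
    liquid-full line with no leak."""
    return bulk_eff * alpha_gasoline * dT_c

print(f"dT = 0.1 degC  ->  dP ~ {thermal_pressure_change(0.1):.1f} psi")
```

With these assumed values a 0.1°C fluctuation yields roughly 10 psi, matching the magnitude cited above.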
From the standpoint of petroleum-dispensing operations, it is difficult to distinguish temperature-induced pressure changes from those that are leak-induced, because the rate of change of pressure varies exponentially both with the volume of product released through a hole in the line and with the change in the temperature of the product. The temperature of the product varies exponentially when product from the tank is brought into the line, because the temperature of this product differs from the temperature of the backfill and soil around the pipeline. This temperature difference, which can be many degrees, results in an exponential change as the product in the line attempts to come into equilibrium with its surroundings. In lines that are 100 to 200 ft long and 2 in. in diameter, it may be 6 to 12 h before the rate of change of temperature is low enough to permit accurate testing.
The traditional methods of compensating for temperature effects, which require the measurement of the rate of change of temperature of the liquid and the pipeline, are impractical because (1) the temperature distribution of the product in the line is spatially inhomogeneous, and a large number of temperature sensors would have to be retrofitted along the line in order to measure it; and (2) installing, maintaining, and calibrating a large number of sensors would be difficult. The best method of compensating for the effects of temperature fluctuations is to wait until these fluctuations are small enough to be negligible. For accurate pressure tests, this waiting period should be between 6 and 12 h.
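The waiting period can be estimated from an assumed exponential temperature-equalization model, T(t) - T_soil = dT0 * exp(-t/tau). All parameter values below (expansion coefficient, line volume, thermal time constant, initial temperature difference, and the tolerable thermal flow rate) are illustrative assumptions, not figures from the text.

```python
import math

alpha = 9.6e-4          # 1/degC, volumetric expansion of gasoline (assumed)
line_vol = 16.0         # gal, roughly a 2-in. x 100-ft line (assumed)
tau = 2.0               # h, thermal time constant of the buried line (assumed)
dT0 = 10.0              # degC, product-vs-soil temperature difference (assumed)

def waiting_time(max_thermal_flow_galh):
    """Hours until the thermally induced volume rate
    alpha * V * |dT/dt| = alpha * V * (dT0/tau) * exp(-t/tau)
    falls below max_thermal_flow_galh."""
    rate0 = alpha * line_vol * dT0 / tau   # gal/h at t = 0
    return tau * math.log(rate0 / max_thermal_flow_galh)

print(f"wait ~ {waiting_time(0.005):.1f} h before testing for a 0.1-gal/h leak")
```

Because the waiting time grows only logarithmically with the required sensitivity, modest parameter changes still land in the multi-hour range, consistent with the 6- to 12-h figure cited above.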
1.2.1.3 Summary
Detecting small leaks in a pressurized pipeline by monitoring the pressure changes in the line is very difficult. High performance requires (1) that the test be long enough to allow the pressure to drop by a specified amount, suitable for detecting the smallest leaks of interest over the full range of pipeline systems to be tested, and (2) that the waiting period between the last dispensing of product and the beginning of the test be long enough for the temperature changes in the line to become negligibly small. To obtain accurate results in the case of the 2-in.-diameter lines found at a typical retail service station, dispensing operations might have to be terminated up to 12 h before beginning the test. Thus, the total time required to conduct a test becomes quite long.
1.2.2 Pipeline Leak Detectors That Attempt to Compensate for Thermal Changes
In U.S. Pat. No. 4,608,857, Mertens describes a method for detecting leaks as small as 1 L/h in a pressurized pipeline without waiting for fluctuations in the temperature of the product to subside. (As we have seen, such fluctuations induce pressure changes that can be mistaken for a leak.) Mertens establishes three measurement periods of equal length. Initial line pressure is the same during the first and third periods but is lower during the middle period. Pressure changes are measured during all three periods. The middle measurement is then subtracted from the average of the first and third. The difference is compared to a threshold, and in this way the existence of a leak is determined. Mertens indicates that the volume of product in the line must be small for the method to work properly. Furthermore, according to Mertens, the method accurately compensates for temperature provided that "the sum of the consecutive measurement periods is very small compared to the half value period of a temperature equalization process."
Analysis of this method shows that, when a leak is present in the line, the average pressure change that occurs during either the first or third periods will always be greater than that during the middle period. Furthermore, depending on the bulk modulus of the pipeline system, the actual volume change that occurs during these measurement periods will vary from one leaking line to another, even when these lines have the same initial starting pressures and identical leaks. Mertens's method does not require that the bulk modulus be measured and does not attempt to interpret the test results in terms of the actual leak rate. Mertens's method declares a leak in the pipeline if the difference between the high- and low-pressure measurements exceeds a predetermined threshold value. However, a wide range of volume changes could produce this same pressure change, and therefore, the accuracy of his method will vary from line to line.
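The differencing argument can be made concrete. In the sketch below, the pressure change in each period is modeled as a constant thermal drift plus an orifice-like leak term -k*sqrt(P); both the model and the numbers are assumptions for illustration, not Mertens's data. A constant drift cancels exactly in the three-period statistic, while the leak term does not.

```python
import math

def mertens_statistic(p_high, p_low, leak_k, drift, period_h=0.25):
    """Model the pressure change over one period as thermal drift (psi/h,
    assumed constant across periods) plus an assumed orifice-like leak
    term -k*sqrt(P).  Returns avg(period 1, period 3) - period 2."""
    def dp(p):
        return (drift - leak_k * math.sqrt(p)) * period_h
    # Periods 1 and 3 are at p_high, period 2 at p_low.
    return (dp(p_high) + dp(p_high)) / 2 - dp(p_low)

tight = mertens_statistic(30.0, 15.0, leak_k=0.0, drift=-8.0)
leaky = mertens_statistic(30.0, 15.0, leak_k=2.0, drift=-8.0)
print(f"tight line: {tight:+.3f} psi   leaking line: {leaky:+.3f} psi")
```

The negative statistic for the leaking line reflects the larger pressure drop during the high-pressure periods. Note that the statistic is still in units of pressure: as the text observes, the same value corresponds to different volume changes in lines with different bulk moduli, so a single threshold cannot be calibrated to a leak rate. The drift also cancels only because it was modeled as constant; an exponentially decaying drift would leave a residual.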
1.2.3 Pipeline Leak Detectors That Attempt to Detect Leaks While There Is Flow in the Line
The method described by Mullen in U.S. Pat. No. 3,702,074 detects leaks in pressurized pipelines while product is flowing through the line. Mullen measures flow rate at two different points along the line (either the inlet and the outlet or any other two points sufficiently distant from one another) and at two different pressures, one high and one low. The difference in flow rate between the two measurements made at the lower pressure is subtracted from the difference between the same measurements made at the higher pressure. The result is then compared to a threshold leak rate, which, if exceeded, is the basis for declaring a leak in the pipeline. Mullen contends that because his measurements are closely spaced in time, he prevents long-term dynamic trends, such as those produced by the thermal expansion and contraction of the product, from affecting the results. However, closely spaced measurements reduce only the total temperature change, not the rate of change. For example, if measurements are made one minute apart, the temperature change is much smaller than if they are made one hour apart; the rate of change, however, is essentially the same over either interval. Mullen's approach does not work because it confuses the total change, which has no bearing on the results, with the rate of change, which corrupts the flow-rate measurements. Mullen's method will effectively compensate for temperature changes only if they happen to be the same during the high- and low-pressure measurements. This is unlikely to be the case, however, because, as stated above, the rate of change of temperature in a pipeline is generally not constant (i.e., it tends to be exponential with time). Furthermore, the fact that Mullen does not account for inventory changes also affects the accuracy of his method. Mullen minimizes short-term transient effects, such as those due to pressure, by taking several readings at each pressure and averaging them.
By isolating different sections of the line and repeating the test on each segment, Mullen can locate the leak. He eliminates false alarms due to faulty equipment by comparing the test results for each segment of pipe tested; if the equipment is faulty, the flow-rate threshold will be exceeded in all of the segments tested.
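Mullen's test statistic can be sketched as follows. The meter readings are hypothetical, and the leak and thermal terms are assumed values chosen to echo the criticism above: the thermal contribution cancels only because it is set identical during both measurements.

```python
def mullen_statistic(qin_hi, qout_hi, qin_lo, qout_lo):
    """(inlet - outlet) flow difference at the higher pressure minus the
    same difference at the lower pressure, compared to a leak threshold."""
    return (qin_hi - qout_hi) - (qin_lo - qout_lo)

# Hypothetical meter readings (gal/h).  A leak between the two meters
# bleeds more product at the higher pressure (assumed orifice behavior);
# a thermal term inflates the inlet reading identically both times.
leak_hi, leak_lo = 0.30, 0.20
thermal = 0.15
stat = mullen_statistic(100.0 + leak_hi + thermal, 100.0,
                        80.0 + leak_lo + thermal, 80.0)
print(f"statistic = {stat:.2f} gal/h")
```

Note that the statistic recovers only the difference between the leak rates at the two pressures (0.10 gal/h here), not the leak itself, and that the thermal term cancels only because it was made equal in both measurements, which, as argued above, is rarely true when the temperature is changing exponentially.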