This invention relates generally to the extraction of heavy oil and bitumen. Heavy oils are crude oils with high specific gravity and viscosity; they are difficult to extract commercially because they do not readily flow. Tar sands are geological formations in which heavy oil is trapped within a sand formation. Achieving in situ separation of the heavy oil from the sand is a well-known and difficult challenge.
Currently, steam is the dominant thermal fluid used for in situ recovery of bitumen and heavy oil. Injected steam raises the temperature of the bitumen, thereby reducing its viscosity and allowing it to flow more easily. Steam extraction is subject to a number of problems, including high heat losses, clay swelling, thief zones, water-oil emulsions, capillary surface tension effects, lack of confinement for shallower zones, and the disposal of large quantities of environmentally damaging salts and organic acids arising from boiler feed water purity requirements. By some estimates, with the best currently available technologies, only 10% of the original bitumen resource in the Athabasca tar sands is economic to extract.
Thermal recovery processes using steam also require large amounts of fuel to be burned to produce the steam and can emit enormous amounts of greenhouse gases such as carbon dioxide. Estimates published by Natural Resources Canada [1] show CO2 emissions of about 70 kg/bbl for bitumen production and a total of about 120 kg/bbl for synthetic crude (i.e. upgraded bitumen, usually derived from surface-mined bitumen).
[1] Canada's Emissions Outlook: An Update, December 1999, Annex B, pg B-6. Available at www.nrcan.gc.ca/es/ceo/update.htm
Recent estimates released by the Alberta Energy Utilities Board [2] and the Canadian Association of Petroleum Producers [3] predict that bitumen (and synthetic crude) production will reach 2 to 2.6 million bbl/day by 2010. This level of bitumen production will produce at least 140 million kilograms (=70×2 million) of CO2 emissions per day (i.e. 300,000,000 to 700,000,000 lbs of CO2 per day, depending on fuel source and the proportion of in situ vs. synthetic crude production).
[2] Alberta's Reserves 2000 and Supply/Demand Outlook 2001-2010, Alberta Energy Utilities Board
[3] Canada's Oil Sands Development, delivered by Eric Newell, Chairman & CEO, Syncrude Canada. Available at http://www.capp.ca/
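The emission figures quoted above can be checked with a short arithmetic sketch (illustrative only; the per-barrel intensities are the cited Natural Resources Canada estimates):

```python
# Rough check of the daily CO2 emission range quoted in the text.
IN_SITU_KG_PER_BBL = 70      # kg CO2 per bbl, in situ bitumen production
SYNTHETIC_KG_PER_BBL = 120   # kg CO2 per bbl, synthetic crude
KG_TO_LB = 2.20462

# Lower bound: all production in situ at 2 million bbl/day.
low = IN_SITU_KG_PER_BBL * 2.0e6
# Upper bound: all production as synthetic crude at 2.6 million bbl/day.
high = SYNTHETIC_KG_PER_BBL * 2.6e6

print(f"low:  {low:.3g} kg/day = {low * KG_TO_LB:.3g} lb/day")
print(f"high: {high:.3g} kg/day = {high * KG_TO_LB:.3g} lb/day")
```

This reproduces the 140 million kg/day floor and the roughly 300 to 700 million lb/day range stated above.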
Solvent extraction processes have been proposed as an alternative to steam extraction processes. One such process is the N-Solv process (Canadian Patent Applications 2299790, 2351148, 2374115). However, the physical chemistry of bitumen extraction in solvent gravity drainage processes is not well understood or characterized. For example, Dunn et al. [4] first reported in 1989 that, for a cold solvent extraction process, the measured CO2 diffusion rates in the tar sands were 460 times higher than the theoretically predicted values. This unexpected result has since been observed and reported by many researchers using a variety of different solvents and crude oil samples, yet the underlying physical mechanism is still not understood.
[4] Dunn, Nenniger and Rajan, A Study of Bitumen Recovery by Gravity Drainage Using Low Temperature Soluble Gas Injection, Canadian Journal of Chemical Engineering, Vol 67, December 1989, pg 985
Several computer models have been developed to predict extraction rates for gravity drainage bitumen extraction using solvent. However, these computer models do not appear to be capable of accurately describing the in situ processes. One potential problem with such models is a lack of spatial resolution: the models are typically far too coarse to accurately resolve the solvent concentration gradients. For example, lab studies by Fisher [5] have revealed that the solvent-bitumen interface is only a couple of millimeters thick. A gridblock size fine enough to accurately model in situ concentration gradients should be perhaps 10 times smaller than the interface thickness (i.e. 100-200 microns). With the typical gridblock size of ~0.5 m used in computer-modeled reservoir simulations (see Nghiem [6]), the concentration gradients calculated by the reservoir simulators are about 500 times smaller than the values actually measured in laboratory tests. For a 3D computer model with appropriate spatial resolution, the number of calculations increases by a factor of 5000³ (=125,000,000,000), which increases the model run time for a given scenario to an unworkable duration. Since the solvent concentration gradient provides the primary driving force for solvent penetration and extraction, existing computer models have a significant problem in accurately representing the process.
[5] Fisher et al, "Use of Magnetic Resonance Imaging and Advanced Image Analysis as a Tool to Extract Information from a 2D Physical Model of the Vapex Process", Society of Petroleum Engineers Paper 59330, April 2000
[6] Nghiem et al, "Modelling Asphaltene Precipitation and Dispersive Mixing in the Vapex Process", SPE Paper 66361, FIG. 2
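The spatial-resolution argument above reduces to a simple back-of-envelope calculation:

```python
# Refining a 3D grid from ~0.5 m blocks to ~100 micron blocks multiplies
# the number of gridblocks (and, at minimum, the work per time step) by the
# cube of the linear refinement factor.
coarse = 0.5       # m, typical reservoir-simulator gridblock size
fine = 100e-6      # m, resolution needed to capture a mm-scale interface

linear_factor = coarse / fine       # one-dimensional refinement factor
cell_factor = linear_factor ** 3    # 3D cell-count (and work) multiplier

print(f"linear refinement: {linear_factor:.0f}x")
print(f"3D cell count:     {cell_factor:.3g}x")
```

This reproduces the factor of 5000³ = 125,000,000,000 stated above.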
As noted earlier, researchers consistently measure solvent-based bitumen extraction rates that are much higher than expected. Thus, we believe it is necessary to use physical models (i.e. experiments) to obtain meaningful data on bitumen yield, extraction rate and bitumen quality. Furthermore, until the details of the solvent extraction mechanism are better understood, we believe it is unrealistic to expect credible predictions from the existing computer models.
Due to the complexity of the physical processes (combined heat, mass and momentum transfer with simultaneous asphaltene precipitation), it may never be possible to develop a fully rigorous theoretical computer model. However, empirical models can be developed that are both accurate and useful. Such empirical models typically require data from a large number of representative physical experiments in order to develop parametric sensitivities to process variables. This type of experimentation is expensive and time consuming, but it has provided the basis for many (if not most) useful chemical engineering processes. However, before a meaningful empirical model can be developed, it is necessary to conduct physical experiments which accurately represent the specific physical processes of interest, and to measure those processes accurately.
The prior art experimental apparatuses and techniques in the tar sand extraction field are generally intended to simulate a small two-dimensional slice of a reservoir. These experiments are typically conducted in a thin-walled rectangular can that is packed with tar sand (see Dunn [4] and Frauenfeld [7]) having physical properties relevant to the reservoir of interest. A simulated injector well is usually located above a simulated producer well at one end of the can. The can is placed within a pressure vessel and external pressure is applied to the can to mimic the overburden stresses appropriate to that reservoir.
[7] Frauenfeld et al, Evaluation of Partly Miscible Processes for Alberta Heavy Oil Reservoirs, Journal of Canadian Petroleum Technology, Vol 37, No 4, 1998
Tar sand extraction processes are typically based on some type of thermal effect, and therefore appropriate consideration of thermal effects in the experiments is important. Two aspects of thermal behavior have been identified that can greatly affect the experimental modeling. First, there is a need to mimic the temperature profiles and temperature gradients within the tar sand which arise from the thermal characteristics of the reservoir extraction process. Second, heat may be lost through the conductive can or sample holder of a typical experimental apparatus, with consequent distortion of the temperature profiles within the sandpack. Such heat loss is referred to as parasitic heat loss.
Parasitic heat loss is an ongoing problem with all thermal gravity drainage experiments. Typically, SAGD researchers have used the dimensional scaling criteria of Butler to work around this problem. Butler's scaling criteria predict that by increasing the tar sand permeability, the time scale can be compressed (e.g. 1 hour of experimental time corresponds to 1 year of field time). Thus, scaled experiments minimize the impact of parasitic heat losses by greatly reducing the experimental time. Butler has used the analogy between heat transfer and mass transfer to develop similar scaling criteria for solvent processes [8]. However, as noted above, the solvent extraction mechanism is not well understood, so the scaling assumptions of Butler's solvent model are in doubt. Thus an approach different from the prior art scaling assumptions of Butler is needed.
[8] Butler et al, A New Process (Vapex) for Recovering Heavy Oils Using Hot Water and Hydrocarbon Vapour, Journal of Canadian Petroleum Technology, January-February 1991, Vol 30, No 1
FIG. 1 is based on the prior art and illustrates the problem that the present invention seeks to address. FIG. 1 shows transient (one-dimensional) temperature profiles along a 60 cm section of tar sand initially at 8 C, at time intervals of 1 minute, 2 hours, one day, three days and seven days after one edge is suddenly heated to 50 C. Zero on the x-axis represents the bitumen interface. The physical properties of the tar sand and the temperature profiles were calculated using the data and formulas presented by Birrell [9]. FIG. 1 also shows the temperature profile expected for a continuous bitumen extraction process taking place in 8 C tar sand with the interface heated to 50 C and an assumed extraction rate of 5 cm/day, again using the formulas presented by Birrell [9]. This latter profile is referred to as a quasi-steady state profile, as it is not expected to change further over time (since the x-axis origin moves with the bitumen interface). FIG. 1 shows that it takes seven days for a sample sandpack to acquire a smooth temperature profile (assuming no parasitic heat losses out the sides of the apparatus), and even then it does not match the temperature profile predicted for quasi-steady state operation. The figure shows that accurate process measurements cannot be made until the temperature profile is no longer changing over time, which even in a small sample can take a very long time to be established.
[9] Birrell, Heat Transfer Ahead of a SAGD Steam Chamber: A Study of Thermocouple Data From Phase B of the Underground Test Facility (Dover Project), Journal of Canadian Petroleum Technology, March 2003
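The two kinds of profile in FIG. 1 can be sketched with the standard one-dimensional conduction solutions: the transient response of a semi-infinite solid whose edge is suddenly held at a fixed temperature (an erfc profile), and the exponential profile ahead of an interface advancing at constant velocity. This is a minimal sketch only; the thermal diffusivity used here is an assumed representative value for tar sand, not Birrell's measured figure.

```python
import math

ALPHA = 7e-7                 # m^2/s, ASSUMED thermal diffusivity of tar sand
T_COLD, T_HOT = 8.0, 50.0    # deg C: initial sand and heated-edge temperatures

def transient_T(x, t):
    """Semi-infinite solid, edge suddenly held at T_HOT: erfc solution."""
    return T_COLD + (T_HOT - T_COLD) * math.erfc(x / (2.0 * math.sqrt(ALPHA * t)))

def quasi_steady_T(x, v):
    """Frame moving with an interface advancing at velocity v: exponential decay."""
    return T_COLD + (T_HOT - T_COLD) * math.exp(-v * x / ALPHA)

v = 0.05 / 86400.0           # 5 cm/day extraction rate, in m/s
for x_cm in (0, 5, 10, 20, 40, 60):
    x = x_cm / 100.0
    print(f"x = {x_cm:2d} cm:  T(7 days) = {transient_T(x, 7 * 86400.0):5.1f} C,"
          f"  quasi-steady T = {quasi_steady_T(x, v):5.1f} C")
```

Both profiles start at 50 C at the interface and decay toward 8 C, but they differ in shape, which is the discrepancy FIG. 1 illustrates.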
At quasi-steady state conditions, the bitumen interface moves with a constant velocity and the solvent condenses at a constant rate determined by the temperature gradient (i.e. the conduction heat loss) at the bitumen interface. Considering the temperature gradient at the bitumen interface (i.e. the slope of the temperature profile at x=0), FIG. 1 shows that the transient temperature gradient is far too high, and consequently the solvent condensation rate will be substantially in error for the first seven days of an experiment.
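The size of this condensation-rate error can be estimated by comparing the interface gradient of the transient (erfc) conduction solution, |dT/dx| = ΔT/√(παt), against that of the quasi-steady exponential profile, |dT/dx| = ΔT·v/α. A minimal sketch, again using an assumed representative diffusivity rather than measured values:

```python
import math

ALPHA = 7e-7               # m^2/s, ASSUMED tar-sand thermal diffusivity
V = 0.05 / 86400.0         # m/s, 5 cm/day interface velocity (from the text)
DT = 50.0 - 8.0            # deg C, interface minus far-field temperature

# Quasi-steady interface gradient: DT * V / ALPHA (constant in time).
grad_qs = DT * V / ALPHA

ratios = {}
for label, t in (("1 minute", 60.0), ("1 day", 86400.0), ("7 days", 7 * 86400.0)):
    # Transient interface gradient: DT / sqrt(pi * ALPHA * t).
    grad_t = DT / math.sqrt(math.pi * ALPHA * t)
    ratios[label] = grad_t / grad_qs
    print(f"{label:>8}: transient gradient is {ratios[label]:6.1f}x the quasi-steady value")
```

With these assumed values the transient gradient is roughly two orders of magnitude too high after one minute and only approaches the quasi-steady value on a time scale of about a week, consistent with the seven-day settling period described above.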
In addition to the problem of achieving quasi-steady state temperature profiles, the parasitic heat losses can be 10-100 times larger than the expected heat delivery rate. If solvent condensation is the only source of heat in the experiment, then these parasitic heat losses result in solvent condensation rates 10-100 times too high. High solvent condensation rates are undesirable and can lead to a host of complications, including flooding of the vapour chamber with liquid and destabilization of asphaltenes. Thus, it is important to minimize the parasitic heat losses and to correctly approximate the quasi-steady state temperature profiles for an accurate simulation to occur.
Thus, in the absence of appropriate scaling assumptions to shorten the physical experiments, real-time experiments are required. In real-time experiments, temperature effects become a much greater concern and impose significant limitations on experimental accuracy. What is needed is an experimental technique and apparatus in which the parasitic heat losses and temperature profiles are controlled in a way that permits measurements to be made which accurately reflect in situ conditions, without an undue amount of time being required. The data generated by such techniques can then be used to develop accurate empirical models.