The operation of process plants has long been supported by the availability of analytical methods, for example, laboratory measurements or online analyzers. Results of these types are valued by plant operations because they are typically regarded as reliable. For example, in hydrocarbon and/or refining operations, primary analytical test methods provide a critical basis for custody transfer of products whose properties have been ascertained in accordance with industry-standard test methods, such as those developed and promulgated by ASTM International.
Notwithstanding the importance of these primary test methods, they suffer from certain inadequacies. For example, laboratory measurements may be available only once or a few times per day. Furthermore, several hours can elapse between the taking of a discrete sample and the reporting of results from tests performed on it, severely limiting the ability to control the process on the basis of those results. Additionally, sample instability, sample contamination, non-representative sampling, and uncertainty in the execution of test procedures may cause erroneous sample values to be returned.
The desire to improve the availability, integrity, and reproducibility of test data has in many cases motivated the deployment of on-line measurements. However, depending upon the type of measurement and analysis being performed, the cycle time for online analyzers may be on the order of ten or more minutes and, in some cases, one hour or more, which may still be inadequate for purposes of maximizing process efficiency or product quality.
The process industries have conventionally responded to the time-delay and reliability issues of primary measurements by instituting secondary measurement techniques capable of predicting properties of certain process streams. Such secondary techniques commonly include some form of model, for example, multivariate statistical models that predict certain properties of interest from process inputs, in which the properties of interest may be termed “dependent variables” and the process inputs “independent variables.”
An important class of these model-based approaches is inferential analyzers, also referred to as “soft sensors” because they typically reside in software. Soft sensors are appealing for at least two reasons. First, they often do not require the installation of additional sensors in the process unit, because they typically rely upon measurements such as temperature, pressure, and flow rate that may already be available. Second, with the advent of distributed control systems, the input measurements relied upon by soft sensors are substantially available in real time, with discrete sampling intervals of one second or less. These advantages address the time-delay disadvantage of primary measurements by providing property predictions at intervals shorter than those typically required by process control systems. Additionally, they obviate the need to physically obtain a sample, eliminating the issues of representative sampling and sample integrity.
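The soft-sensor idea described above can be illustrated with a minimal sketch: a multivariate linear model fitted to historical data, predicting a dependent variable (a stream property) from independent variables (temperature, pressure, and flow rate). All variable names, ranges, and data below are hypothetical and purely illustrative; practical inferential analyzers often use more elaborate multivariate statistical methods.

```python
# Minimal soft-sensor sketch: predict a stream property (dependent
# variable) from routinely available process measurements (independent
# variables) via ordinary least squares. Data are synthetic.
import numpy as np

rng = np.random.default_rng(0)

# Historical process data: temperature (degC), pressure (bar), flow (m3/h).
X = rng.uniform([300.0, 10.0, 50.0], [350.0, 20.0, 80.0], size=(200, 3))

# Synthetic "true" property relationship plus laboratory-like noise.
true_coefs = np.array([0.8, -2.5, 0.3])
y = 100.0 + X @ true_coefs + rng.normal(0.0, 0.5, size=200)

# Fit by least squares, augmenting with an intercept column.
A = np.column_stack([np.ones(len(X)), X])
coefs, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(temperature, pressure, flow):
    """Inferential (soft-sensor) prediction of the stream property."""
    return coefs[0] + coefs[1] * temperature + coefs[2] * pressure + coefs[3] * flow

# Once fitted, predictions are available at the scan rate of the control
# system, without waiting for a laboratory result.
corrected_estimate = predict(325.0, 15.0, 65.0)
```

Because the inputs already exist in the distributed control system, such a model can be evaluated every second or faster, which is the key advantage noted above.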
While the fidelity of these models may be quite good over limited time periods, ranging from a few hours to perhaps a few days, conventional inferential analyzers tend to be insufficiently robust, because in aggregate the independent variables that serve as model inputs typically relate to the chemistry of the process stream both indirectly and incompletely. They are indirect expressions of the chemistry to the extent that sensor readings on the process are functions of both process conditions and the material in the process; they are incomplete insofar as the number of independent variables used in the models is fewer than the degrees of freedom in the system, which relate both to the process system and to the material being processed through it. An exception may occur when steady-state or quasi-steady-state conditions prevail and many process and stream variables are nominally constant, e.g., when feed quality and the operation of the process system are substantially invariant. At such times, the independent variables may “determine” stream chemistry in the mathematical sense, and property predictions by an inferential analyzer may be highly reliable. A fundamental issue, however, is that such models are generally correlative, and because correlation does not necessarily denote causation, inferential models may be largely empirical, with first principles having only distant influence. Indeed, the literature freely refers to the most common modeling approach as a “black box method.” In summary, property predictions by inferential analyzers are labile to the extent that the effect (a predicted value) is removed from the primary cause (a stream property that ultimately is determined by sample chemistry).
The common practice, therefore, is to use periodic laboratory measurements to update the outputs of inferential analyzers. This strategy may improve the quality of inferential predictions across (i) the full range of possible feed qualities; (ii) changes in the condition of the process system, e.g., fouling; and (iii) changes in the response of sensors whose readings serve as independent variables, e.g., those resulting from simple drift or from replacement of a faulty sensor with a new one. Nevertheless, issues attach to this approach that limit the possibilities for optimizing the performance of continuous processes, such as maximizing throughput, adhering closely to product quality targets, minimizing energy usage, extending catalyst service life, and the like. A need remains in the art for improved model adaptation procedures, in particular, to accommodate variations in operating conditions resulting from, for example: changes in the composition and properties of the feed; other process unit changes not within the scope of the model; sensor failure and/or discrepancies in process measurements; the frequency of availability of laboratory analyses and/or measurements from analytical instruments for updating the model; and analyzer reliability.
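One widely used form of the lab-based updating described above is a filtered bias correction: each new laboratory result is compared against the prediction made when the sample was drawn, and the difference is folded into a running bias added to the model output. The class name, filter gain, and numeric values below are hypothetical, and a real implementation must also handle time-alignment between the sample and the prediction.

```python
# Sketch of bias-updating an inferential analyzer's output from
# periodic laboratory results. Names and values are illustrative.

class BiasUpdater:
    """Exponentially filtered bias between lab results and model output."""

    def __init__(self, gain=0.3):
        self.gain = gain  # 0 < gain <= 1; a small gain damps lab noise
        self.bias = 0.0

    def correct(self, raw_prediction):
        """Return the bias-corrected inferential prediction."""
        return raw_prediction + self.bias

    def update(self, lab_value, prediction_at_sample_time):
        """Fold in a new lab result against the time-matched prediction."""
        error = lab_value - (prediction_at_sample_time + self.bias)
        self.bias += self.gain * error

updater = BiasUpdater(gain=0.5)

# Suppose the model persistently reads 2.0 units low, e.g. after feed
# quality drifts outside the model's training envelope. Repeated lab
# updates pull the corrected output toward the lab value.
for _ in range(10):
    updater.update(lab_value=342.0, prediction_at_sample_time=340.0)

corrected = updater.correct(340.0)
```

The gain trades responsiveness against sensitivity to laboratory noise, which is one reason this simple scheme, while common, leaves room for the improved model adaptation procedures noted above.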
Even small deviations in model accuracy can significantly impact the economics of production for large-scale processes, such as the process units found in petroleum refineries and petrochemical plants. There remains a need in the art for improved methods of measuring properties of hydrocarbon streams frequently, preferably on-line and in substantially real time. Furthermore, there remains a need in the art for measuring such properties with a high degree of accuracy.