Gas chromatography systems, particularly those integrated with mass spectrometry detectors (GC-MS), are undeniably the primary choice in instrumentation for the identification and quantitation of environmentally concerning volatile and semi-volatile organic compounds (VOCs and SVOCs). This prevalence can be confirmed in any modern environmental analysis laboratory, where commercial GC-MS instruments have become a mainstay. However, such instrumentation requires meticulous calibration and consistent performance validation to ensure accurate and legally justifiable quantitative results. The calibration process can be divided into three separate tasks: periodic tuning of the instrument, calibration of analyte-specific response, and routine quality control (QC) to affirm a stable signal.
Classically, GC-MS tuning is performed in two separate stages, with the GC and MS being tuned independently. Contemporary capillary columns in gas chromatography systems are most commonly tuned in terms of linear retention index (LRI) by liquid injection of a wide range of n-alkane species. This process allows end users to normalize the retention time of unknown chemical species to the normal-alkane mixture, thus aiding in the identification of these unknown compounds. Regrettably, a given LRI plot is specific to each individual instrument set-up using an explicit GC oven program. This specificity requires that a new LRI plot be generated any time an instrument component or method parameter is changed, in turn requiring a new standard n-alkane mixture to be purchased or prepared.
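The normalization described above is conventionally computed with the van den Dool and Kratz formula for temperature-programmed GC, which interpolates the unknown's retention time between the two bracketing n-alkanes. The following is a minimal sketch; the function name and the retention-time values are illustrative, not drawn from the disclosure.

```python
def linear_retention_index(rt_unknown, alkane_rts):
    """Linear retention index (van den Dool & Kratz) for
    temperature-programmed GC.

    rt_unknown : retention time of the unknown compound (min)
    alkane_rts : dict mapping n-alkane carbon number -> retention time (min)
    """
    carbons = sorted(alkane_rts)
    for n, n_next in zip(carbons, carbons[1:]):
        t_n, t_next = alkane_rts[n], alkane_rts[n_next]
        if t_n <= rt_unknown <= t_next:
            # Interpolate linearly between the bracketing alkanes,
            # scaled so each carbon number spans 100 index units.
            return 100 * (n + (rt_unknown - t_n) / (t_next - t_n))
    raise ValueError("retention time outside bracketing alkane range")

# Hypothetical retention times (min) for C8-C10 n-alkanes
rts = {8: 5.20, 9: 7.10, 10: 9.05}
print(linear_retention_index(6.15, rts))  # 850.0
```

Because the alkane retention times are specific to one column and oven program, the `rts` table above is exactly the "LRI plot" that must be regenerated whenever the set-up changes.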
MS tuning is most commonly performed automatically by direct introduction of a perfluoro-compound vapour into the MS, sourced from a liquid calibration vial installed directly on the instrument. Although this process still needs to be performed quite regularly, a single vial will last for periods exceeding a month.
Following instrument tuning, proper calibration must be performed to adequately relate the arbitrary instrument response to the absolute amount of analyte present in a sample. Regardless of the separation method chosen (GC or LC), the response of mass-spectrometry-based detection will vary with the chemical structure of a given analyte, requiring that a separate calibration curve, at multiple concentration levels, be prepared for each compound quantified. These curves must also be re-run any time the signal from the instrument is found to drift, as determined by proper QC analyses. With these factors considered, it is not surprising that the modern analytical chemist spends more time preparing standard solutions for calibration than on any other task in the analytical laboratory.
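The multi-level curve described above is typically a least-squares fit of response against concentration, inverted to quantify unknowns. The sketch below assumes a simple linear response model; the function names and five-level data are hypothetical illustrations, not values from the disclosure.

```python
def fit_calibration(concs, responses):
    """Ordinary least-squares fit of detector response vs. concentration,
    response = slope * conc + intercept, from multi-level standards."""
    n = len(concs)
    mean_x = sum(concs) / n
    mean_y = sum(responses) / n
    sxx = sum((x - mean_x) ** 2 for x in concs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(concs, responses))
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x

def quantify(response, slope, intercept):
    """Invert the calibration curve to report a sample concentration."""
    return (response - intercept) / slope

# Hypothetical five-level curve for one analyte (ng/mL vs. peak area)
slope, intercept = fit_calibration([1, 2, 5, 10, 20],
                                   [105, 205, 505, 1005, 2005])
print(quantify(1005, slope, intercept))  # 10.0
```

Because each analyte responds differently, one such (slope, intercept) pair must be maintained per compound, and all of them become invalid once QC shows the signal has drifted.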
Proper experimental quality control may be considered the final crucial aspect in ensuring legitimate instrument tuning and calibration. QC analyses are intermittently run within a sample set and the calibration in order to ensure that the GC-MS or LC-MS is in tune and that the detector response remains statistically constant during sample analyses after calibration. In many private and government laboratories, some form of QC analysis may be performed as frequently as one in every ten runs. This frequency requires that the amount of analyte loaded from the QC source be as repeatable as possible. Such repeatability can become increasingly difficult in high-throughput applications where, classically, a single QC solution may be completely exhausted before the instrument signal has shifted, requiring an analyst to prepare multiple QC standards, which may introduce unwanted preparative error. This limitation is especially true for volatile standards that cannot be made up in bulk lest analyte be unintentionally lost to the surrounding environment.
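One common way to operationalize "statistically constant" is a Shewhart-style control check: each new QC result is compared against limits derived from prior QC runs. The sketch below assumes a mean ± 3σ criterion and hypothetical QC peak areas; neither is specified in the disclosure.

```python
import statistics

def qc_within_limits(qc_history, new_value, k=3.0):
    """Return True if a new QC measurement falls within mean +/- k*stdev
    of prior QC results (a simple Shewhart-style control check)."""
    mean = statistics.mean(qc_history)
    sd = statistics.stdev(qc_history)
    return abs(new_value - mean) <= k * sd

# Hypothetical QC peak areas from earlier runs
history = [100.0, 102.0, 98.0, 101.0, 99.0]
print(qc_within_limits(history, 103.0))  # True: signal stable
print(qc_within_limits(history, 110.0))  # False: drift, recalibrate
```

A failed check is the trigger, noted earlier, for re-running the analyte calibration curves.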
It would therefore be advantageous over prior techniques to implement a calibration technology that could be used multiple times without significant depletion. Furthermore, it would prove very useful if such a system remained stable over a prolonged period of time, even when highly volatile compounds are present. It is also important that said technology be able to load a quantity of standard low enough to be representative of the trace levels of analyte expected in environmental samples. As a final requisite, it should be possible to manufacture the device reproducibly at an industrial level. To address these requirements, a simple, in-vial standard analyte generator is herein disclosed.