In the clinical laboratory today, the majority of serum, blood and urine assays are performed using techniques that employ analyte-specific reagents to produce color reactions. These tests play a critical role in diagnosing and monitoring a wide variety of disorders, with a typical hospital running many thousands of such assays every month.
Reagent-free infrared (IR) spectroscopic analysis of dried films is a technique that enables accurate and cost-effective testing for key serum, blood and urine assays, as well as for certain other niche applications [1-11]. This method offers several potential benefits over standard analytical methods: no purchase or storage of reagents is required; several analyte levels may be determined simultaneously from a single spectrum; the technique is non-destructive, allowing additional measurements if required; linearity is routinely available over the full range of analyte levels; the use of dried films allows for simple shipment of samples; the method lends itself readily to automation and high-volume repetitive measurements; and minimal technical expertise is required of the operator. The method has successfully measured several key serum and urine analytes with accuracy sufficient for routine clinical analysis.
A key characteristic of this approach is that it eliminates the difficulties associated with strong water absorptions. Quantification of analyte levels, or diagnosis, is most effectively accomplished via multivariate analysis [1,2], although any data analysis technique that extracts quantitative or qualitative information from IR spectra is equally applicable. The procedure used to derive a new IR-based analytical method begins by acquiring spectra for a set of samples (typically 200-300), together with the corresponding quantitative analyses for the components of interest (as provided by established analytical methods). The samples are then divided into a training set comprising approximately two thirds of the available samples and a test set made up of the remaining third. For each analyte of interest, a quantification algorithm is derived via regression analysis, typically a partial least-squares (PLS) approach that takes the training spectra and the corresponding analyte levels as input. As a final gauge of its accuracy, the newly derived algorithm is used to predict analyte levels for the set of test samples, and the predicted levels are compared to their true values.

The approach used to develop a new IR-based diagnostic classification test parallels that used to develop analytical methods. First, spectra are accumulated for a large number of samples in each disease category of interest. Pattern recognition software is then used to derive an algorithm that optimally distinguishes disease spectra from control spectra. The classification algorithm is then used to predict diagnoses from the spectra of an independent set of test samples, and the predicted diagnoses are compared to the true diagnoses.
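The calibration workflow described above (train/test split, PLS regression on spectra, prediction on held-out samples) can be sketched as follows. This is a minimal single-response PLS (NIPALS PLS1) implementation on synthetic "spectra"; the band positions, noise level, 200/100 split and component count are illustrative assumptions, not the published method:

```python
import numpy as np

def pls1_fit(X, y, n_components=2):
    """Fit a single-response PLS model via NIPALS: extract latent
    components maximizing covariance between spectra X and reference
    analyte levels y, then collapse them into a regression vector."""
    X, y = np.asarray(X, float), np.asarray(y, float)
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc                       # weight vector
        w /= np.linalg.norm(w)
        t = Xc @ w                          # scores
        tt = t @ t
        p = Xc.T @ t / tt                   # X loadings
        qk = yc @ t / tt                    # y loading
        Xc = Xc - np.outer(t, p)            # deflate X
        yc = yc - qk * t                    # deflate y
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)     # regression vector
    return B, x_mean, y_mean

def pls1_predict(model, X_new):
    B, x_mean, y_mean = model
    return (np.asarray(X_new, float) - x_mean) @ B + y_mean

# Synthetic example: 300 "spectra", each the sum of an analyte band,
# a dominant interfering band, and noise (all values assumed).
rng = np.random.default_rng(0)
axis = np.linspace(0, 1, 50)
band = np.exp(-((axis - 0.3) ** 2) / 0.005)    # analyte band
interf = np.exp(-((axis - 0.7) ** 2) / 0.01)   # interferent band
conc = rng.uniform(1, 10, 300)                 # "reference" levels
other = rng.uniform(5, 50, 300)
X = (np.outer(conc, band) + np.outer(other, interf)
     + 0.01 * rng.standard_normal((300, 50)))

# Two-thirds training set, remaining third held out as the test set.
model = pls1_fit(X[:200], conc[:200], n_components=2)
pred = pls1_predict(model, X[200:])
rmse = np.sqrt(np.mean((pred - conc[200:]) ** 2))
```

Comparing `pred` against the held-out reference values (e.g. via the root-mean-square error `rmse`) plays the role of the final accuracy gauge described above.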
A number of diagnostically relevant analytes presently elude detection by IR spectroscopy because they make insignificant contributions to the overall IR profile, owing in part to their relatively low concentrations and in part to the coincidence of their absorption bands with those of other, strongly absorbing constituents. Since the absorption patterns of all IR-absorbing compounds are superimposed upon one another, the measured absorption profile is typically dominated by a small number of the most concentrated analytes. This imposes a constraint on how much material may be dried to provide a useful IR spectrum: beyond a certain film thickness, absorption by the predominant compounds completely blocks the incident IR light, which in turn places a clear lower limit on the concentrations of analytes that may be determined by IR spectroscopy of multi-constituent samples. However, lower-concentration analytes would become accessible simply by drying a greater amount of the (bio)fluid under investigation (a longer effective optical pathlength) if the influence of the dominant constituents were substantially reduced or removed.
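The pathlength constraint can be illustrated with simple Beer-Lambert arithmetic (A = εcl, with superimposed absorbances adding). All numerical values below are round, illustrative assumptions, including the ~2 AU ceiling taken here as the point where quantification degrades (roughly 1% transmitted light):

```python
# Beer-Lambert sketch: a dominant constituent caps the usable film
# thickness, which caps the signal available from a trace analyte.
# All epsilon/concentration values are arbitrary illustrative numbers.
eps_dom, c_dom = 0.5, 70.0     # dominant constituent (e.g. protein-like)
eps_trace, c_trace = 0.5, 0.01 # low-concentration analyte of interest
A_MAX = 2.0                    # assumed practical absorbance ceiling

# Largest pathlength (film thickness, arbitrary units) before the
# dominant band blocks the incident light:
l_limit = A_MAX / (eps_dom * c_dom)
A_trace_at_limit = eps_trace * c_trace * l_limit   # tiny signal

# If the dominant constituent were removed, a far thicker film could
# be dried, amplifying the trace-analyte absorbance proportionally:
l_thick = 100 * l_limit
A_trace_thick = eps_trace * c_trace * l_thick
```

Removing the dominant absorber thus raises the usable pathlength, and with it the trace-analyte signal, by the same factor, which is the opening exploited by the preconditioning approach introduced next.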
A microfluidic sample-preconditioning technique, often referred to as the laminar fluid diffusion interface (LFDI) [12-26], provides the basis to accurately quantify analytes that are otherwise inaccessible to reagent-free IR spectroscopy, while preserving all of the advantages of reagent-free spectroscopy discussed above.