The use of stable isotopes to obtain biological information has a long and illustrious history [see Hellerstein, Metabolic Engineering 6:85-100 (2004)]. The oldest and most frequent such usage is in studies probing metabolism, wherein a stable isotope is incorporated into a specific molecule at a specific location. This isotopically-labeled molecule, or “precursor”, is fed to an in vivo organism, in vitro cell system, or in vitro cell-free system for a brief or extended period of time, after which the fate of the isotope is determined by NMR, mass spectrometry (MS), chemical degradation, or another detection technique.
In contrast to the use of radioactive isotopes, the use of stable isotopes is generally regarded as safe and free of regulation. Although a study typically incorporates a single isotope at a specific location in order to follow the metabolic fate of a molecule with precision, other embodiments utilize wholly-labeled molecules (>99% of the atoms of a given element are replaced with an isotopic equivalent) or universally-labeled molecules (the isotope is distributed throughout the target molecule at less than saturation levels). There are many known studies in which more than one isotope is incorporated into a target molecule and all of the isotopic fragments are examined for their differential fates. In all cases, these methods are targeted analyses; i.e., they seek the incorporation of a specific labeled atom into other specific molecules.
Yet another use of stable isotopically-labeled compounds is as internal standards for their non-labeled counterparts. In such a use, an isotopically-enriched molecule is added to a sample or extract at a known concentration prior to analysis, and the final measurement determines the exact concentration of the non-labeled material by comparison. In this type of study, it is not uncommon for a researcher to add more than one isotopically-distinct standard if more than one molecule is to be quantified.
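The comparison described above can be sketched in a few lines. The function name, peak areas, and concentration below are hypothetical illustrations, and the sketch assumes the labeled and non-labeled isotopologues ionize with equal efficiency:

```python
# Sketch of isotope-dilution quantitation using a labeled internal
# standard spiked at a known concentration. All numbers are
# hypothetical; equal ionization response for the labeled and
# non-labeled forms is assumed.

def quantify_by_isotope_dilution(analyte_area, standard_area, standard_conc):
    """Return the analyte concentration implied by the ratio of the
    non-labeled (analyte) peak area to the labeled (standard) peak area."""
    if standard_area <= 0:
        raise ValueError("standard peak area must be positive")
    return standard_conc * (analyte_area / standard_area)

# A labeled standard spiked at 10 uM; the non-labeled peak area is
# 1.5x the labeled peak area, so the analyte is present at 15 uM.
conc = quantify_by_isotope_dilution(analyte_area=3.0e6,
                                    standard_area=2.0e6,
                                    standard_conc=10.0)
```

The same arithmetic extends to multiple analytes simply by spiking one isotopically-distinct standard per molecule to be quantified.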
Indeed, there are extreme forms of this approach in which one prepares an extremely complex mixture by growing a complex organism on an isotopically-defined feedstock, such that the entire organism is heavily, if not entirely, composed of molecules enriched in a single isotope [Wu et al., Anal Biochem 336:164-171 (2005)]. In this situation, the same standard mixture is introduced into all samples, but the standard carries no information other than for purposes of relative quantitation; i.e., the standard has no relation to the experiment at hand. Historically, such standards are carefully constructed to differ from any other analyte by a specific mass difference.
In many areas of science the need for reproducible chromatographic separations is fundamental. The most common approach is to test the equipment repeatedly prior to running the sample to be analyzed, because the inherent variability of chromatographic systems is an unfortunate and, as yet, unsolved problem. One solution to this problem is to add, or “spike”, compounds not native to the injected sample at predetermined concentrations before injection and to use them as reference points in the eluent stream. These compounds are referred to as chromatographic standards.
In all such cases, the time of elution (and possibly the quantitation) is then mathematically corrected according to the position (and size) of the standard peaks. When this is done, the chromatogram is based not on “retention time” but on “retention index”. This strategy works well when the standards can be easily identified and are separated from the other constituents of the sample either in time or by another physical characteristic, such as mass.
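One common form of this mathematical correction is linear interpolation between bracketing standard peaks, as in a Kováts-style retention index. The sketch below is a minimal illustration; the function name, the standard retention times, and the assigned index values are all hypothetical:

```python
# Sketch of a retention-time -> retention-index conversion by linear
# interpolation between bracketing chromatographic standards.
# Standard retention times and index values are hypothetical.
import bisect

def retention_index(rt, std_rts, std_indices):
    """Convert an observed retention time to a retention index.
    std_rts: observed retention times of the standards (sorted ascending);
    std_indices: the index values assigned to those standards."""
    if not (std_rts[0] <= rt <= std_rts[-1]):
        raise ValueError("retention time outside the standard range")
    i = bisect.bisect_right(std_rts, rt)
    if i == len(std_rts):          # rt coincides with the last standard
        i -= 1
    lo, hi = std_rts[i - 1], std_rts[i]
    frac = (rt - lo) / (hi - lo)   # fractional position between standards
    return std_indices[i - 1] + frac * (std_indices[i] - std_indices[i - 1])

# Standards (e.g., an alkane ladder) eluting at 2.0, 5.0, and 9.0 min,
# assigned indices 800, 900, 1000; a peak at 6.0 min maps to index 925.
ri = retention_index(6.0, [2.0, 5.0, 9.0], [800, 900, 1000])
```

Because the index is anchored to the observed standard peaks in each run, run-to-run drift in absolute retention time is cancelled out.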
Unfortunately, this is all too frequently not the case. If another compound co-elutes from the column in the vicinity of the standard and shares any common ions, then the standard is unusable. For this reason, several such standards are frequently required in order to assure that at least one of the standards will be usable.
Ion suppression is a phenomenon that occurs during the mass spectrometric ionization process when the efficiency of sample ionization varies with the characteristics of the analyte compounds that are present. In its most common form, the number of molecules that could be ionized exceeds the amount of charge available. In this situation, the molecules that are ionized most efficiently are those that acquire charge most strongly, and the remaining molecules are ionized with much lower efficiency.
The present invention provides a method for the creation and use of patterns of stable isotopes as internal standards in mass spectral analyses. Thus, a contemplated method utilizes a compound with a predefined, unique and non-natural ratio of stable isotopes as a standard, and thereby provides one solution to mass spectral analysis problems associated with ion suppression, as well as providing a more general standard that can be used in assaying multiple types of systems.
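To illustrate why a non-natural isotope ratio makes a standard easy to locate, consider a hypothetical standard synthesized so that its M and M+2 isotopologue peaks appear at roughly equal intensity, a pattern that natural isotopic abundances do not produce for small organic molecules. The sketch below is an assumed illustration, not the claimed method; the function, peak list, and tolerances are all hypothetical:

```python
# Hypothetical sketch: locating a standard bearing a predefined,
# non-natural isotope pattern (an enforced ~1:1 M/M+2 doublet) in a
# list of (m/z, intensity) peaks. All values and tolerances are
# illustrative assumptions.

def find_doublet(peaks, delta_mz=2.0, ratio=1.0, mz_tol=0.02, ratio_tol=0.1):
    """Return (mz1, mz2) pairs separated by delta_mz whose intensity
    ratio falls within ratio_tol of the predefined ratio."""
    hits = []
    for mz1, i1 in peaks:
        for mz2, i2 in peaks:
            if abs((mz2 - mz1) - delta_mz) <= mz_tol and i2 > 0:
                if abs(i1 / i2 - ratio) <= ratio_tol:
                    hits.append((mz1, mz2))
    return hits

# The 180.06/182.07 pair shows the enforced ~1:1 pattern; the small
# 181.06 peak is consistent with natural 13C abundance and is ignored.
peaks = [(180.06, 1.0e6), (181.06, 6.6e4), (182.07, 9.8e5), (200.1, 5e5)]
hits = find_doublet(peaks)  # -> [(180.06, 182.07)]
```

Because the pattern travels with the standard through the separation, it can be recognized even when absolute intensities are altered by ion suppression.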