There has been continuing interest in evaluating the long-term health effects of exposure to ionizing radiation. Early studies, in the 1950s, were primarily concerned with doses "lower than lethality." As science soon discovered, lower than lethality by no means meant harmless. In recognition of the adverse health effects of low levels of radiation, film badges have been used as a convenient way to monitor an individual's level of radiation exposure: by reading the film badge worn by an individual, the amount of radiation to which that individual was exposed while wearing it can be determined. These devices have practical utility when there is adequate motivation to monitor the individual's exposure level, that is, when the risk of exposure to non-trivial amounts of radiation is known in advance.
However, individuals are frequently exposed to non-trivial doses of radiation without being monitored. A number of reasons exist for non-monitoring. First, the presence of radiation may not be recognized. This is particularly true in environments where no artificial source of ionizing radiation exists, such as in the home. However, as demonstrated by discovery of radon-rich environments, locations traditionally considered radiation-free may not be so. Second, even where the presence of radiation is recognized, its threat may not be viewed as sufficiently great to warrant monitoring. The triviality of exposure is determined in view of the dosage and frequency of exposure and the perceived danger associated with such exposure. Despite our present understanding of the danger of greater than background exposure, occasional exposures to radiation levels greater than background, such as those from medical and dental x-ray procedures, are presently not monitored. Regularly occurring exposures greater than background are considered non-trivial, and, where such exposures occur in the workplace, regulation requires that they be monitored with film badges. However, such has not always been the case. Consequently, the levels of past radiation exposure for many workers were not monitored.
For the reasons set forth above, accurate determination of cumulative past exposure for most individuals is presently impossible. Therefore, it is desirable to develop a method for determining prior exposure of an individual to radiation which does not rely on the individual having been contemporaneously monitored for exposure.
It is well documented that radiation can induce long-lived cellular alterations of the hematopoietic stem cells (Akiyama et al., "Evaluation of Four Somatic Mutation Assays as Biological Dosimeter in Humans" in Radiation Research: A Twentieth-Century Perspective, Dewey et al., eds., Vol. II Congress Proceedings, New York: Academic Press, pp. 177-182 (1992)). Furthermore, studies have related carcinogenesis to the level of somatic mutation (McCann et al., "Detection of carcinogens as mutagens in the Salmonella/microsome test: Assay of 300 chemicals," Proc. Nat. Acad. Sci. USA, 73:950-954 (1976); Clive et al., "Validation and characterization of the L5178Y/TK+/- mouse mutagen assay system," Mutation Research, 59:61-108 (1979)). Consequently, investigations of cumulative lifetime dosimetric methods have focused on these stem cells and the materials produced therefrom.
Upon division, stem cells produce red blood cells, white blood cells or platelets. The identity of the cell produced and the rate of stem cell division are governed by the needs of the individual as expressed by regulatory chemicals. For example, stem cells of an individual suffering from blood loss (anemia) respond by dividing more rapidly, as well as increasing the production of red blood cells relative to white blood cells and platelets.
Several studies have focused on the effect of prior radiation exposure on the bone marrow level of red blood cell precursors ("RBCp"). RBCp are cells produced by stem cells which are sufficiently developed to be distinguished from the other cells produced by the stem cells, but which have yet to leave the bone marrow and enter the blood stream.
Gong et al., "A Method for Determining Residual Injury in the Hematopoietic System of the X-Irradiated Rat," Radiation Research, 37(3):467-477 (1969), studied the level of RBCp produced in response to anemia induced by bleeding the individual animal. The anemic response, in terms of marrow level of RBCp, was depressed in individuals exposed to radiation relative to those not exposed. The study also found that the effect of radiation on anemic response was long-lived: as a function of time from radiation exposure, the depression of anemic response recovered exponentially, with a half-life (t1/2) in rats of 30 weeks, about one-fifth of the rat lifespan.
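The recovery behavior reported by Gong et al. can be illustrated numerically. The sketch below assumes simple exponential recovery with the stated 30-week half-life; the function name and the time points chosen are illustrative and not part of the cited study.

```python
# Illustrative sketch of exponential recovery with the 30-week half-life
# reported for rats. Only the half-life comes from the cited study; the
# rest is a hypothetical illustration.

T_HALF_WEEKS = 30.0  # half-life of the radiation effect in rats (from the text)

def residual_fraction(weeks_since_exposure: float) -> float:
    """Fraction of the initial effect remaining after the given time,
    assuming simple exponential recovery with half-life T_HALF_WEEKS."""
    return 0.5 ** (weeks_since_exposure / T_HALF_WEEKS)

# After one half-life (30 weeks) half the effect remains; after two, a quarter.
print(residual_fraction(30))  # 0.5
print(residual_fraction(60))  # 0.25
```

On this model, roughly five half-lives (about 150 weeks) would be needed before the residual effect falls below a few percent, consistent with the observation that the effect persists for a substantial fraction of the rat lifespan.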
The method, though in principle useful for determining prior radiation exposure, is impractical for assessing the extent of exposure in a large population or in humans. First, the assay requires that the patient be bled to the point of illness to induce an anemic response. Second, the method requires a somewhat complicated and often painful surgical procedure to extract bone marrow from the bone of the patient. Finally, because stem cells remain localized in the region where they are irradiated (Gong et al., "Iron Kinetics Effects of 88 Millirads: Partial-Versus-Total Body X-Irradiation," Cell Biophysics, 13:15-27 (1988)), the method determines only the radiation exposure of the bone from which the marrow is extracted. If the prior exposure was non-uniform over the body, the method fails to indicate accurately the total radiation received by the patient as a whole.
A related method for evaluating prior radiation exposure is described in Gong et al., "Effects of Low-Level (1.0 R) X-Irradiation on the Erythroid-Response of the Rat Bone Marrow," Radiation Research, 65:83-97 (1976), based on the observation that, in non-anemic subjects, RBCp count increases with increased radiation exposure. The method does not require inducing an anemic response, but simply involves obtaining a bone marrow sample, determining the number of RBCp therein, and correlating this number to the increased numbers of RBCp observed in irradiated individuals. Although this method does not suffer the disadvantage of requiring an induced anemic response, it still requires a bone marrow biopsy and fails to provide an accurate assessment of total radiation exposure when exposure is non-uniform over the body.
RBCp levels in bone marrow (both under bled and non-bled conditions) as a function of radiation exposure were quantitatively studied in Gong et al., "The Effects of Low Dose (Less than 1 Rad) X-Rays on the Erythropoietic Marrow," Cell Biophysics, 5:143-162 (1983) ("Gong (1983)"). The results indicate that both observed effects of radiation (i.e., non-anemic RBCp elevation and suppressed anemic response) could be accurately described by a linear-logarithmic dose-response curve. Further, both techniques showed the observable effect of radiation on RBCp decreased exponentially with time from exposure, with identical half-lives.
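The functional form described by Gong (1983) can be sketched as follows. The coefficients a and b and the half-life default are placeholders (the fitted values are in Gong (1983), not reproduced here); only the linear-logarithmic shape and the exponential decay over time come from the text.

```python
import math

# Sketch of a linear-logarithmic dose-response with exponential decay of
# the effect over time. Coefficients a, b and the half-life default are
# hypothetical placeholders, not values from Gong (1983).

def rbcp_effect(dose_rad: float, weeks_since: float,
                a: float = 0.0, b: float = 1.0,
                t_half_weeks: float = 30.0) -> float:
    """Relative RBCp effect: linear in log(dose), decaying exponentially
    with time since exposure."""
    if dose_rad <= 0:
        return 0.0
    initial_effect = a + b * math.log10(dose_rad)
    return initial_effect * 0.5 ** (weeks_since / t_half_weeks)

# Under this form, a ten-fold increase in dose adds a fixed increment b
# to the initial effect:
print(rbcp_effect(10, 0) - rbcp_effect(1, 0))  # 1.0 (= b)
```

The key property of such a curve for dosimetry is that equal multiplicative changes in dose produce equal additive changes in response, so the assay remains sensitive across a wide dose range rather than saturating quickly.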
At a minimum, all dosimeters using RBCp share two disadvantages: the need to obtain a bone marrow sample and the inability to determine whole-body radiation exposure from a bone marrow sample. These disadvantages do not arise when blood cells, rather than marrow cells, are used as indicators.
Both erythrocytes and lymphocytes have been the focus of a number of studies for evaluating the extent of various somatic mutations. These studies are summarized in Akiyama et al., "Evaluation of Four Somatic Mutation Assays as Biological Dosimeter in Humans" in Radiation Research: A Twentieth-Century Perspective, Dewey et al., eds., Vol. II Congress Proceedings, pp. 177-182, New York: Academic Press (1992). The ability to evaluate the extent of various somatic mutations has become increasingly important in view of a growing interest in the spectrum of mutational lesions which can occur in mammalian somatic cells and in the role of various lesions in carcinogenesis. Studies using in vitro cell systems have provided direct molecular evidence for a large number of mutational mechanisms that can lead to stable phenotypic changes in somatic cells. Since many of these mutational mechanisms have been implicated in the development of specific human tumor types, measurements of the frequency of different classes of mutagenic events in normal human cells in vivo would facilitate assessment of the health risks from these events. In addition, evaluation of the extent of somatic mutations provides a means for determining prior exposure to radiation.
Wijayalaxmi et al., "Measurement of spontaneous and X-irradiation-induced 6-thioguanine-resistant human blood lymphocytes using a T-cell cloning technique," Mutation Research, 125:87-94 (1984), and Sanderson et al., "Mutations in human lymphocytes: effects of X- and UV-irradiation," Mutation Research, 140:223-227 (1984), relate to a lymphocyte hypoxanthine guanine phosphoribosyl transferase ("HPRT") mutation assay. Results for 127 A-bomb survivors showed a statistically significant dose-related increase in the number of HPRT-deficient mutants. However, the dose-response relation is quite shallow, 2.3×10⁻⁶/Gy, necessitating the sampling of a very large number of white blood cells per individual and rendering the technique impractical for large-scale surveys. Furthermore, because of in vivo selection against the mutant lymphocytes, the effectiveness of the HPRT assay is short-lived, useful only for 1-2 years following exposure.
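The practical consequence of the shallow slope can be made concrete with back-of-the-envelope arithmetic. Only the 2.3×10⁻⁶/Gy slope comes from the text; the target mutant count and dose below are hypothetical choices for illustration, and background mutants are ignored.

```python
# Why a shallow dose-response slope forces very large samples.
# Slope is from the text; dose and target count are illustrative.

HPRT_SLOPE_PER_GY = 2.3e-6  # induced mutant frequency per Gy (from the text)

def cells_needed(dose_gy: float, target_mutants: int) -> int:
    """Number of lymphocytes that must be assayed to expect
    `target_mutants` radiation-induced HPRT mutants at the given dose,
    ignoring the spontaneous background."""
    induced_frequency = HPRT_SLOPE_PER_GY * dose_gy
    return round(target_mutants / induced_frequency)

# To expect on the order of 100 induced mutants after a 1 Gy exposure,
# tens of millions of lymphocytes must be sampled:
print(cells_needed(1.0, 100))  # roughly 43 million cells
```

Since lymphocytes are a small minority of blood cells, a requirement of tens of millions of assayed lymphocytes per individual translates into the large blood samples and long processing times that make the assay impractical for surveys.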
Kyoizumi et al., "Spontaneous loss and alteration of antigen receptor expression in mature CD4+ T cells," J. Exp. Med., 171:1981-1999 (1990), discloses a dosimeter based on a lymphocyte T-cell antigen receptor ("TCR") mutation assay. As is well known, most normal T-lymphocytes have surface expression of CD3 complexes, consisting of CD3 and a TCRαβ chain heterodimer. Since the TCR genes are functionally hemizygous, when a mutation occurs in a TCR gene (α or β), the CD3 complex cannot be expressed on the cell surface, and such mutants are detected as CD3-negative cells among CD4-positive helper-inducer T-cells. The slope, about 10⁻⁴/Gy, is approximately 10 times greater than that for T-cell HPRT mutants. Another advantage of the assay is the commercial availability of monoclonal antibodies, which means that the assay can be completed in several hours. However, the TCR mutant has a half-life of 2 years, and, therefore, the assay is not useful as a lifetime dosimeter. The TCR assay is further limited by inaccuracies introduced by spontaneous TCR mutations, which increase with age even without radiation exposure.
Recently, Turner et al., "Mutations in human lymphocytes commonly involve gene duplication and resemble those seen in cancer cells," Proc. Natl. Acad. Sci. USA, 85:3189-3192 (1988), have developed the lymphocyte HLA-A locus mutation assay. The assay uses monoclonal antibodies specific to HLA-A2 or HLA-A3, plus complement, to kill normal cells expressing the HLA-A2 or A3 antigen on the surface. Mutant cells lacking the antigen survive. The dose-response relation is reportedly 3.1×10⁻⁵/Gy. The half-life of the mutants, however, is approximately that of the TCR mutants. Therefore, the HLA-A mutation assay is not suitable for use as a lifetime dosimeter.
All of these methods using white blood cells require a large sample of blood, because only 0.01% of blood cells are white cells. This disadvantage, in addition to the short-lived nature of these assays, has been overcome, to some degree, by Langlois et al., "Measurements of the frequency of human erythrocytes with gene expression loss phenotypes at a glycophorin A locus," Hum. Genet., 74:353-362 (1986). Langlois determines prior exposure by detecting the loss of gene expression at the glycophorin A locus in human somatic cells. Glycophorin A ("GPA"), a cell-surface sialoglycoprotein on erythrocytes, occurs in two allelic forms, M and N, and is the product of codominantly expressed alleles on chromosome 4. In the GPA expression-loss assay, pairs of monoclonal antibodies specific for the individual allelic forms are each conjugated with a different fluorescent dye and used to label fixed erythrocytes from heterozygous MN donors. Flow cytometry and sorting are used to enumerate and purify rare, single-color cells that lack the expression of one of the two GPA alleles. Presumably, these cells lack expression because they are progeny of mutated erythroid precursor cells. The GPA assay revealed the persistent presence of a stem cell mutation approximately 50 years after irradiation, which makes it the only presently available mutation assay system with the potential to be used as a lifetime dosimeter.
However, the assay has limitations. One drawback relates to uncertainty at the high-dose limit. Since the assay measures the presence of GPA protein, and since GPA expression decreases with increasing exposure, statistical error increases with increasing dose. At high doses, where loss of gene expression is essentially complete and vanishingly few cells express GPA, statistical anomalies become the overriding factor. Another disadvantage is that the GPA assay is useful only in individuals who are heterozygous for the MN blood type, only about 50% of the human population. Furthermore, the GPA assay requires that 5 million cells be counted to identify between 20 and 900 variant GPA cells over an exposure range of 0 to 300 cGy. Because of the limitations of flow cytometry technology, each sample takes approximately half an hour to complete. This counting time, though short relative to the counting times encountered in the lymphocyte assays discussed above, makes the GPA method unsuitable for screening large populations.
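The statistical limitation can be illustrated with the figures quoted above: between 20 and 900 variant cells are found among 5 million erythrocytes over 0 to 300 cGy. Assuming Poisson counting statistics (an assumption for illustration, not a statement from the cited work), the relative uncertainty on a raw count N is roughly 1/√N.

```python
import math

# Poisson counting-error sketch using the variant-cell counts quoted in
# the text (20 to 900 variants per 5 million cells over 0-300 cGy).
# The Poisson assumption is an illustrative simplification.

def relative_counting_error(variant_count: int) -> float:
    """Approximate fractional Poisson error on a raw count: 1/sqrt(N)."""
    return 1.0 / math.sqrt(variant_count)

# At the low end of the range the count is quite uncertain; at the high
# end it is much tighter:
print(f"{relative_counting_error(20):.2f}")   # 0.22 (about 22% at 20 variants)
print(f"{relative_counting_error(900):.3f}")  # 0.033 (about 3% at 900 variants)
```

This is why counting 5 million cells per sample is unavoidable: with fewer cells scanned, the variant counts at low doses would be in the single digits and the dose estimate would be dominated by counting noise.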
Therefore, there remains a need for a lifetime biological dosimetry process that is based on a readily available body fluid, that is uniformly sensitive to dose over the range from 0 to 600 cGy, and that is suitable for screening of large populations.