Quantitative characterization of tissue structure and function is one of the most challenging problems in medical imaging. Diffuse optical methods can be used to measure biological tissues or other turbid (i.e. light-scattering) samples with resolution and depth sensitivity ranging from microns to centimeters, limited by fundamental light-tissue interactions. Important tissue components (referred to as chromophores), such as oxy-hemoglobin, deoxy-hemoglobin, and water, can be detected optically and correlate with various indicators or indices of local tissue health or physiological status. Examples of such indices include tissue oxygen saturation (stO2, the fraction of oxygenated blood), total blood volume (ctTHb), tissue water fraction (ctH2O), and tissue perfusion or metabolism. These indices can provide a powerful means for physicians to perform diagnoses and/or guide therapies. These chromophores can be detected because they have absorption spectra with detectable features in the visible and/or near-infrared regions. In essence, a light source can be used to illuminate a tissue sample, and the remitted light can be used to measure the absorption features of the tissue and quantify the chromophore of interest. In practice, this is a difficult measurement due to the presence of scattering in tissue. A class of probe-based technologies has been described in academia and has also been translated commercially by a number of companies (Somanetics, Hutchinson, ViOptix). Each of these technologies uses different algorithms and hardware components (illumination sources, spectral detection) to account for, correct, or control for tissue scattering and thereby derive meaningful information about hemoglobin and tissue oxygenation. These probes take advantage of the large selection of single-point detectors that enable spectral flexibility and high sensitivity. However, contact probes suffer from some major limitations.
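As a concrete illustration of how the derived indices relate to the underlying chromophore concentrations, the following minimal Python sketch computes stO2 and ctTHb from oxy- and deoxy-hemoglobin values; the numeric inputs are hypothetical values chosen for illustration only.

```python
def tissue_indices(ctO2Hb, ctHHb):
    """Derive tissue oxygen saturation (stO2, in percent) and total
    hemoglobin / blood volume (ctTHb) from oxy- and deoxy-hemoglobin
    concentrations expressed in the same units (e.g. micromolar)."""
    ctTHb = ctO2Hb + ctHHb           # total hemoglobin
    stO2 = 100.0 * ctO2Hb / ctTHb    # fraction of oxygenated blood, percent
    return stO2, ctTHb

# Hypothetical example values (micromolar):
stO2, ctTHb = tissue_indices(ctO2Hb=60.0, ctHHb=40.0)
# stO2 == 60.0 (%), ctTHb == 100.0 (uM)
```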
By nature, contact probes are not imaging technologies and thus are not ideal for assessing large areas of tissue. This is important because tissue health is often spatially variant, for example in tissue wounds (burns, ulcers, skin flaps, etc.), where spatial contrast can be present both between normal tissue and the wound and within the wound itself (e.g. wound boundary vs. wound center). With contact probes, synthesizing even a low-resolution image requires placing multiple probes at a number of tissue locations or scanning a single probe across the surface. Typical wounds can vary from a few mm to many cm in size, a large range that is challenging for probe technologies to design for, address, and/or adapt to.
Camera-based optical spectral imaging methods have also been developed in academia and commercially. A multi-spectral imaging technology using visible light (HyperMed) has been applied to measure tissue oxygenation over a wide field of view (~10 cm×10 cm) and to the monitoring of diabetic wounds. Multi-spectral imaging methods typically employ wavelengths which sample only the top superficial (<1 mm deep) layers of tissue. While near-infrared light (650-1000 nm) penetrates much more deeply, the chromophore contrast in the reflected or transmitted light signal is more challenging to isolate and quantify because the tissue scattering coefficient is strong compared to absorption. A technology that overcomes this limitation and assesses tissue health over a wide field of view in a non-contact manner, both in superficial layers (~100 μm deep) and in subsurface layers (1-10 mm), is more valuable and is therefore desired.
A novel optical imaging method called Modulated Imaging (MI), which enables quantitative analysis of disease progression and therapeutic response over a wide field of view and range of tissue depths without requiring direct contact, was recently introduced. MI has been described in U.S. Pat. No. 6,958,815 B2, herein referred to as Bevilacqua et al., which is incorporated herein by reference. This technique comprises illuminating biological tissue or another turbid medium (a sample that is both scattering and absorbing) with a spatially modulated light (or "structured light") pattern at one or more optical wavelengths and analyzing the resulting back-reflected and scattered light collected from the tissue. A preferred embodiment of MI is called Spatial Frequency Domain Imaging (SFDI), in which the spatial light pattern, or structure, is sinusoidal; this provides an algorithmically simple way of extracting the structured light contrast from a small number (typically 3-15 per wavelength) of structured light measurements. When combined with multi-spectral imaging, the optical properties at two or more wavelengths can be used to quantitatively determine the in-vivo concentrations of chromophores that are relevant to tissue health, e.g. oxy-hemoglobin (ctO2Hb), deoxy-hemoglobin (ctHHb), and water (ctH2O).
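As a sketch of the algorithmic simplicity referred to above, the following Python example implements the three-phase demodulation commonly used with sinusoidal SFDI patterns, in which three images of the same pattern shifted by 0, 2π/3, and 4π/3 yield the per-pixel AC amplitude. This is one common embodiment, offered here for illustration rather than as the only demodulation scheme contemplated.

```python
import numpy as np

def demodulate_three_phase(i1, i2, i3):
    """Per-pixel AC amplitude and DC offset recovered from three images of
    the same sinusoidal illumination pattern, phase-shifted by 0, 2*pi/3,
    and 4*pi/3 (standard three-phase SFDI demodulation)."""
    m_ac = (np.sqrt(2.0) / 3.0) * np.sqrt(
        (i1 - i2) ** 2 + (i2 - i3) ** 2 + (i3 - i1) ** 2)
    m_dc = (i1 + i2 + i3) / 3.0
    return m_ac, m_dc

# Synthetic check: simulate one image row under a 3-cycle sinusoid.
x = np.linspace(0.0, 1.0, 256)
dc, ac, fx = 1.0, 0.3, 3.0  # arbitrary illustration values
phases = (0.0, 2.0 * np.pi / 3.0, 4.0 * np.pi / 3.0)
i1, i2, i3 = (dc + ac * np.cos(2.0 * np.pi * fx * x + p) for p in phases)
m_ac, m_dc = demodulate_three_phase(i1, i2, i3)
# m_ac recovers ac (0.3) and m_dc recovers dc (1.0) at every pixel.
```

Because the three phase offsets are equally spaced, the pairwise-difference formula cancels the local phase of the pattern, so the amplitude is recovered at every pixel regardless of position within the fringe.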
In order to perform spectroscopic (wavelength-dependent) measurements of absorbing chromophores, the MI technique requires collection of remitted spatially structured light from tissue at various wavelengths. This has been accomplished to date by repeating the disclosed technique of Bevilacqua et al. for each desired wavelength. Thus, total imaging times scale directly with the number of wavelengths measured. This can be particularly challenging at some near-infrared wavelengths, where illumination sources are less bright, optical throughput is low, and detector quantum efficiencies are low due to CCD limitations. For low-throughput wavelengths, long integration times (tens to hundreds of ms) are required to obtain an adequate signal-to-noise ratio. Light intensity must be increased at those wavelengths in order to reduce integration time. However, this is limited by the etendue, or light throughput, of the structured light projection hardware, including the light source (e.g. LEDs, lasers, white light bulb), the optical relay system (e.g. lenses, waveguides, mirrors), and the pattern generation technology (e.g. a reflective digital micromirror array or liquid-crystal-on-silicon device, a patterned transmissive material or LCD array, or a holographic element). "Brute force" increases in the intensity of weak or inefficient wavelength bands can have other effects, including increased power consumption, increased thermal stress (which can lead to further source inefficiency and instability), and increased cooling requirements. Longer imaging times also create a practical issue in medical (or other motion-sensitive) applications, as small movements of the measurement sample (e.g. tissue) under study lead to artifacts in the final image. It is therefore desirable to provide an apparatus and method that improves the capability of current modulated imaging methods, maintaining accuracy while improving system efficiency and reducing imaging time.
As described briefly above, MI comprises illuminating a sample with one or more spatially structured intensity patterns over a large (many cm2) area of a tissue (or other turbid) sample and collecting and analyzing the resulting light received back from the sample. An analysis of the amplitude and/or phase of the spatially structured light received back from the sample as a function of spatial frequency or periodicity, often referred to as the modulation transfer function (MTF), can be used to determine the sample's optical property information at any discrete wavelength. Examples of tissue optical properties include light absorption, light scattering (magnitude and/or angular dependence), and light fluorescence. Analysis of this light-dependent data (model-based or empirically derived) can be used to generate 2D or 3D maps of the quantitative absorption (μa) and reduced scattering (μs′) optical properties. Region-wise (multi-pixel) assessments can also be produced by averaging or otherwise accumulating multiple spatial optical property or derived results. By using the spatial frequency or periodicity information at various wavelengths, MI can separate absorption (μa) and fluorescence from scattering (μs′) effects, which each result from physically distinct contrast mechanisms.
Mapping the absorption coefficient (μa) at multiple wavelengths by MI, in turn, enables quantitative spectroscopy of tissue chromophores, including but not limited to oxy- and deoxy-hemoglobin and water (ctO2Hb, ctHHb, and ctH2O), and of derived physiology parameters such as tissue oxygen saturation and blood volume (stO2 and ctTHb). The spatially varying phase of the light collected from the tissue can also be measured simultaneously, and yields topological surface information. This combination of measurements enables visualization of the 3D tissue profile, as well as calibration data for accommodating curved surfaces in the analysis. A typical data flow is shown in FIG. 1.
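The spectroscopic step described above amounts to inverting a Beer's-law-type linear system, μa(λ) = Σi εi(λ)·ci, for the concentration vector. The sketch below illustrates this with a least-squares fit; the extinction-coefficient matrix contains placeholder numbers invented for illustration only (real analyses use tabulated chromophore spectra), and the exact units convention is an assumption.

```python
import numpy as np

# Illustrative only: placeholder extinction coefficients. Rows correspond
# to wavelength bands, columns to chromophores (ctO2Hb, ctHHb, ctH2O).
# These numbers are NOT tabulated spectra; they merely make the system
# well-conditioned for the demonstration.
E = np.array([
    [0.8, 3.2, 0.0001],   # e.g. a deoxy-sensitive red band
    [1.1, 0.9, 0.0005],   # e.g. a near-infrared band
    [1.0, 0.8, 0.0100],   # e.g. a water-sensitive ~970 nm band
])

def fit_chromophores(mua, extinction=E):
    """Least-squares inversion of mua = extinction @ c for the chromophore
    concentration vector c, given one absorption value per wavelength."""
    c, *_ = np.linalg.lstsq(extinction, mua, rcond=None)
    return c

# Forward-simulate mua for known concentrations, then recover them:
c_true = np.array([60.0, 40.0, 0.65])
c_fit = fit_chromophores(E @ c_true)
```

With as many (or more) wavelengths as chromophores and a well-conditioned extinction matrix, the fit recovers the concentrations; ill-conditioned wavelength choices degrade it, which is why water quantification leans on the ~970 nm band.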
A present issue in the measurement and analysis of MI is imaging time. Longer imaging times increase sensitivity to motion and ambient lighting, which can result in artifacts in the two-dimensional maps of the measured biological metrics, particularly in clinical applications. Hardware limitations are a key cause of long imaging times. High-power light sources, such as light-emitting diodes (LEDs), can ameliorate the issue, but measurement time remains a problem in the near infrared, because both LED power and camera sensitivity can depend strongly on wavelength, and LED power is limited by cooling requirements and the size of the apparatus.
FIG. 2 shows an example dataset of an infant burn wound, collected with a prior art modulated imaging apparatus, which exhibits motion artifacts. FIG. 2(b) shows reflectance data versus wavelength and spatial frequency. Note the high-spatial-frequency striped artifact pattern in the demodulated 970 nm data (right, bottom). Here the term demodulated data means the extracted amplitude of the light received from the tissue, normalized to the amplitude of the light illumination at each spatial frequency; in other words, the demodulated data is the modulation transfer function of the illuminated tissue. These artifacts are due to motion during the long integration times required for this wavelength. As FIG. 2(c) highlights, a 10× longer integration time (i.e. 5 s) is required to acquire the data set at 970 nm compared to the other, shorter wavelengths (i.e. only 0.5 s). Using all wavelength information to produce chromophore or scattering amplitude/slope measurements results in sinusoidal artifacts in the derived data, as shown in the average scatter amplitude image in FIG. 2(d).
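The normalization just defined can be written as a one-line computation. In the sketch below, the illumination amplitude is taken from a reference measurement made under the identical projected pattern; the optional rescaling by a known reference reflectance reflects common SFDI calibration practice and is an illustrative assumption, not a specific disclosure of this document.

```python
def demodulated_mtf(m_ac_tissue, m_ac_reference, rd_reference=1.0):
    """Demodulated data as defined above: the extracted AC amplitude of
    light received from the tissue, divided by the AC amplitude of the
    illumination at the same spatial frequency. When the illumination
    amplitude is obtained by imaging a calibration phantom, the ratio is
    rescaled by the phantom's known diffuse reflectance (rd_reference),
    removing the instrument response common to both measurements."""
    return (m_ac_tissue / m_ac_reference) * rd_reference
```

For example, a tissue AC amplitude of 0.2 measured against a reference amplitude of 0.4 gives an MTF of 0.5, and 0.3 after rescaling by a phantom reflectance of 0.6.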
It has been shown that if the 970 nm measurement (and thus the analysis of water concentration (ctH2O)) is excluded, ctO2Hb and ctHHb can still be accurately calculated by assuming a typical tissue water fraction. FIG. 2(e) shows the resulting analysis when 970 nm data are excluded, which correctly identifies a high-scattering region in the upper left corner of the infant's arm, indicated by the black arrow. This region corresponds to the most severe location of the burn and is useful to identify. However, water sensitivity is highly desirable in many studies, so excluding the 970 nm data is not an acceptable solution.
In general, therefore, it is desirable to have the flexibility to capture spectral contrast measurements of target chromophores at various wavelengths while adding minimal complexity, if any, to the structured light requirements of the core modulated imaging technique. It is therefore desirable to provide an apparatus and a method that remove the effects of artifacts at wavelengths with poor performance/sensitivity, in order to provide full information about the concentrations and/or distributions of all relevant components, including ctH2O, ctO2Hb, ctHHb, and others (e.g. bilirubin, methemoglobin, lipids, exogenous agents).