Spectroscopic imaging combines digital imaging and molecular spectroscopy techniques, which can include Raman scattering, fluorescence, photoluminescence, ultraviolet, visible and infrared absorption spectroscopies. When applied to the chemical analysis of materials, spectroscopic imaging is commonly referred to as chemical imaging. Instruments for performing spectroscopic (i.e. chemical) imaging typically comprise an illumination source, image gathering optics, focal plane array imaging detectors and imaging spectrometers.
In general, the sample size determines the choice of image gathering optic. For example, a microscope is typically employed for the analysis of samples with sub-micron to millimeter spatial dimensions. For larger objects, in the range of millimeter to meter dimensions, macro lens optics are appropriate. For samples located within relatively inaccessible environments, flexible fiberscopes or rigid borescopes can be employed. For very large scale objects, such as planetary objects, telescopes are appropriate image gathering optics.
For detection of images formed by the various optical systems, two-dimensional, imaging focal plane array (“FPA”) detectors are typically employed. The choice of FPA detector is governed by the spectroscopic technique employed to characterize the sample of interest. For example, silicon (“Si”) charge-coupled device (“CCD”) detectors or CMOS detectors are typically employed with visible wavelength fluorescence and Raman spectroscopic imaging systems, while indium gallium arsenide (“InGaAs”) FPA detectors are typically employed with near-infrared spectroscopic imaging systems.
Wide-field spectroscopic imaging of a sample can be implemented by collecting spectra over the entire area encompassing the sample simultaneously using an electronically tunable optical imaging filter such as an acousto-optic tunable filter (“AOTF”) or a liquid crystal tunable filter (“LCTF”). Here, the organic material in such optical filters is actively aligned by applied voltages to produce the desired bandpass and transmission function. The spectra obtained for each pixel of such an image thereby form a complex data set, referred to as a hyperspectral image, which contains the intensity values at numerous wavelengths, i.e., the wavelength dependence of each pixel element in the image.
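A hyperspectral image of this kind can be thought of as a three-dimensional array: two spatial axes plus one wavelength axis. The following is a minimal NumPy sketch, with all dimensions, wavelengths, and intensity values hypothetical:

```python
import numpy as np

# Hypothetical hyperspectral cube: 64 x 64 spatial pixels recorded at 30
# tuning wavelengths of an electronically tunable filter (values illustrative).
rows, cols, n_bands = 64, 64, 30
wavelengths_nm = np.linspace(500.0, 700.0, n_bands)  # filter tuning steps
cube = np.random.rand(rows, cols, n_bands)           # stand-in intensities

# The spectrum of a single pixel is its intensity at every wavelength.
pixel_spectrum = cube[10, 20, :]
assert pixel_spectrum.shape == (n_bands,)

# A single-wavelength image is one plane of the cube.
image_at_band = cube[:, :, 5]
assert image_at_band.shape == (rows, cols)
```

Each plane of the cube corresponds to one setting of the tunable filter, and each pixel's vector along the wavelength axis is its spectrum.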
Spectroscopic devices operate over a range of wavelengths determined by the operating ranges of the available detectors and tunable filters. This enables analysis in the ultraviolet (“UV”), visible (“VIS”), near infrared (“NIR”), short-wave infrared (“SWIR”), mid-wave infrared (“MWIR”), and long-wave infrared (“LWIR”) wavelengths, as well as some overlapping ranges. These correspond to wavelengths of approximately 180-380 nm (“UV”), 380-700 nm (“VIS”), 700-2500 nm (“NIR”), 850-1800 nm (“SWIR”), 650-1100 nm (“MWIR”), 400-1100 nm (“VIS-NIR”) and 1200-2450 nm (“LWIR”).
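Because several of the ranges quoted above overlap, a given wavelength can fall into more than one named band. A small lookup-table sketch in Python, using the ranges exactly as quoted:

```python
# Approximate spectral bands as quoted above, in nm; note the overlaps.
BANDS_NM = {
    "UV": (180, 380),
    "VIS": (380, 700),
    "NIR": (700, 2500),
    "SWIR": (850, 1800),
    "MWIR": (650, 1100),
    "VIS-NIR": (400, 1100),
    "LWIR": (1200, 2450),
}

def bands_containing(wavelength_nm):
    """Return the names of all bands whose range includes the wavelength."""
    return [name for name, (lo, hi) in BANDS_NM.items()
            if lo <= wavelength_nm <= hi]

# A wavelength of 1064 nm falls inside the NIR, SWIR, MWIR and VIS-NIR
# ranges as quoted.
print(bands_containing(1064))
```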
During spectral analysis, data are subject to various optical artifacts. For example, placing any component in the path of light can reduce the efficiency of detection. Because any change in instrument efficiency is reflected in the data generated, the data must be corrected to remove these effects.
In the case of Raman imaging data, the real physical phenomenon being measured is the Raman scattered light emanating from a location in the field-of-view represented by a pixel in a data set. The Raman scattered light passes through a set of imaging optics to a detector. In general, the optics are fixed components made of solid materials with stable optical characteristics. In full field-of-view Raman imaging of tissues, one of the optics is a liquid crystal tunable filter spectroscopic imaging element. This is a dynamically tunable narrow bandpass (~0.25 nm FWHM) filter that allows imaging of the same field-of-view at different wavelengths without moving any optics. The specific advantages of an approach based on this hardware are realized in the speed of acquisition and the alignment of images at different wavelengths. A disadvantage of this device is that there can be fluctuations in its transmission efficiency that depend on conditions such as temperature, atmospheric pressure and humidity. These fluctuations are significantly larger than those of standard physical optics under the same conditions, and they manifest themselves in the amount of light that is transmitted, and hence in the amount of Raman scattered light that is recorded at the detector. Because these fluctuations vary with environmental conditions, they manifest themselves differently at different operating conditions.
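The effect of such transmission fluctuations, and the corresponding per-wavelength correction, can be sketched as follows. All transmission and intensity values here are hypothetical; a real instrument would have to measure or estimate its transmission at each tuned wavelength:

```python
import numpy as np

wavelengths_nm = np.array([500.0, 550.0, 600.0, 650.0])

# Raman-scattered intensity actually reaching the tunable filter
# (hypothetical values).
true_intensity = np.array([100.0, 80.0, 120.0, 60.0])

# Hypothetical LCTF transmission efficiency at each tuned wavelength; in
# practice this drifts with temperature, pressure and humidity.
transmission = np.array([0.30, 0.28, 0.33, 0.25])

# What the detector records is attenuated by the filter's transmission.
recorded = true_intensity * transmission

# Dividing out the (known or estimated) transmission recovers the signal.
corrected = recorded / transmission
assert np.allclose(corrected, true_intensity)
```

The difficulty described above is that `transmission` is not constant: it fluctuates with environmental conditions, so a correction measured under one set of conditions may not hold under another.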
Due to these fluctuations, an optical instrument operating in a real-life scenario does not have a perfect or ideal performance for all wavelengths of light. This is true at an optical component level, at an optical system level, or both.
Currently, the state of the art relies on separate measurements of a known material to correct for optical artifacts via software correction. There exists a need for a more rapid system and method that allows for real-time instrument response correction. It would be advantageous if a system and method could provide for instrument response correction without requiring a separate measurement, thereby increasing speed of instrument operation and data generation.
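The state-of-the-art software correction described above amounts to estimating a per-wavelength instrument response from a separate measurement of a known reference material, then dividing measured sample data by that response. A minimal sketch, with hypothetical names and values:

```python
import numpy as np

def instrument_response(measured_reference, known_reference):
    """Estimate per-wavelength instrument response from a known standard."""
    return measured_reference / known_reference

def correct(measured_sample, response):
    """Remove the instrument response from a measured sample spectrum."""
    return measured_sample / response

# Hypothetical separate measurement of a standard with a known spectrum.
known = np.array([1.0, 1.0, 1.0, 1.0])          # idealized flat reference
measured_ref = np.array([0.8, 0.9, 0.85, 0.7])  # what the instrument records

response = instrument_response(measured_ref, known)

# Correcting a hypothetical sample measurement with that response.
measured_sample = np.array([40.0, 54.0, 25.5, 14.0])
corrected = correct(measured_sample, response)
```

The drawback motivating the present need is visible in the sketch: the reference must be measured separately, and, because the response fluctuates with operating conditions, that separate measurement can become stale by the time the sample is measured.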