In many optical spectroscopy applications, a user wants to measure an absorption signal caused by the presence of a small quantity of a sample analyte. In such cases, the fractional change in the light incident on the detector caused by sample absorption may be smaller than the intensity fluctuations of the light source, making an accurate absorption measurement impossible. Reducing the intensity of the light source reduces the amplitude of these intensity fluctuations; however, it also reduces the magnitude of the absorption signal, so the signal-to-noise ratio of the measurement does not improve. What is needed is a method of reducing the background light on the detector without reducing the magnitude of the absorption signal by the same fractional amount. Such a technique would increase the signal-to-noise ratio, making the measurement sensitive to smaller absorption signals that would otherwise be buried in laser intensity noise.
In the time domain, the width of a single pulse from a mode-locked laser may be less than 1 picosecond (ps), and the time interval between adjacent light pulses (pulse interval) may be on the order of 10 nanoseconds (ns). In the frequency domain, the output of the pulsed light source consists of a large number of evenly spaced frequency components. The initial phases of these frequency components are aligned so that the components cancel one another in the time interval between two adjacent light pulses, such that there is essentially no light between the adjacent light pulses. However, upon passing through a sample, frequency components of the pulse train that lie near transitions of the sample analytes are partially attenuated and/or phase shifted. This interaction with the sample disrupts the careful balance among the many modes in the pulsed laser's frequency spectrum. The result is that a small amount of light appears in the laser beam during the time interval between pulses. Furthermore, the spectrum of this intra-pulse light is determined by the absorption spectrum of the sample analyte. An interferometric method for removing the unabsorbed light present during the laser pulses, which allows one to perform background-free spectroscopy on the intra-pulse light, is therefore desired.
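The mechanism described above can be sketched numerically. In this minimal model (an illustration only, not the method contemplated in this document), the comb is represented by N equally spaced modes of equal amplitude and zero initial phase; one pulse period sampled at N points is then the inverse DFT of the mode amplitudes. Attenuating a few modes, mimicking a narrow sample absorption, produces a small residual field between the pulses whose spectrum contains exactly the perturbed modes. The mode indices and the 5% attenuation are arbitrary choices for the demonstration.

```python
import numpy as np

# Unperturbed comb: N equal-amplitude, phase-aligned modes.
# One pulse period sampled at N points is the inverse DFT of the amplitudes.
N = 256
amps = np.ones(N)
field0 = np.fft.ifft(amps) * N        # time-domain field over one period

# All energy sits in the pulse at t = 0; between pulses the modes cancel.
assert np.allclose(field0[1:], 0.0)

# "Sample absorption" (hypothetical): attenuate modes 58..62 by 5%.
amps_abs = amps.copy()
amps_abs[58:63] *= 0.95
field1 = np.fft.ifft(amps_abs) * N

# The balance among modes is now disrupted: a small nonzero field
# appears in the interval between pulses.
between_pulse_peak = np.abs(field1[1:]).max()
print("max between-pulse |E|:", between_pulse_peak)

# The spectrum of that extra light is set by the absorption: the
# residual field (with vs. without sample) contains only modes 58..62.
residual = field1 - field0
spectrum = np.abs(np.fft.fft(residual) / N)
print("modes carrying residual power:", np.nonzero(spectrum > 1e-9)[0])
```

In this toy model, the between-pulse field is weaker than the pulse peak by roughly the ratio of the attenuated power to the total comb power, which is why the unabsorbed pulse light must be removed interferometrically before the intra-pulse light can be measured with high sensitivity.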