Optical microscopy is useful for the study of living cells because of its comparatively weak effect on the intracellular environment and because of the wealth of information that can be extracted with highly developed optical techniques, such as time-resolved spectroscopy. Because living cells are three-dimensional (3D) systems that exhibit features on length scales from the atomic to the macroscopic, and dynamic evolution on time scales spanning many orders of magnitude, the refinement of optical microscopy to visualize these features and this evolution with ever-increasing spatial and temporal resolution remains an area of active interest.
Referring to FIG. 1, a common approach in optical microscopy is to illuminate a sample 102 with light filling the rear pupil 100 of an objective lens 101, and then to collect the light resulting from the interaction of the focused illumination with the sample 102 using either the same lens 101 or a different lens. In widefield microscopy, as shown in FIG. 1, the excitation light 103 from a source 104 impinges on the plane of the rear pupil 100 from a multitude of angles, so that the entire field of view within the sample 102 is illuminated simultaneously. Then, a detector 106 images the collected light 105 representing the signal of interest. Because detection across the entire field of view usually occurs simultaneously, widefield imaging can be quite rapid.
Light 107 emitted near the focal plane 108 of the objective 101 produces a clear, focused image at the detector 106. However, a widefield system also collects light (e.g., 109 and 110) emitted from other planes (e.g., 111 and 112), which can create an out-of-focus background haze in thicker samples 102 and obscure the crisp image data from the focal plane 108.
As shown in FIG. 2, confocal microscopy addresses this issue in two ways. First, the objective lens 101 is illuminated at its rear pupil 100 with a nearly parallel beam of light 200, which can be created, for example, by filtering the excitation light 103 through a pinhole mask 201 and then collimating the light with a lens 205, thereby creating a single, concentrated spot of light 206 within the focal plane 207. Second, a pinhole 208 in an opaque mask 210 is centered at the confocal position in the image plane relative to the focal spot 206. Light 209 emitted from the sample near the focal spot 206 is largely concentrated at the pinhole 208 and is passed to the detector 211. However, light (e.g., 212 and 213) from other focal planes (e.g., 214 and 215) is not well concentrated at the confocal position in the image plane and, therefore, is mostly excluded from passing through the pinhole 208. By scanning the sample and the focal spot 206/detection pinhole 208 combination relative to one another on a point-by-point basis in one, two, or three dimensions, a composite image can be captured by the detector 211.
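The point-by-point acquisition described above can be sketched as a toy model in which each scan position contributes one image value, and the pinhole passes in-focus emission while strongly attenuating out-of-focus emission. The function name, the two-component representation of the sample, and the 5% out-of-focus acceptance below are illustrative assumptions, not part of this description:

```python
def confocal_scan(sample, pinhole_acceptance=0.05):
    """Toy point-scanning model.

    Each element of `sample` is a tuple (in_focus, out_of_focus) giving the
    emission strength at that scan position; the pinhole passes in-focus
    light fully and out-of-focus light only at `pinhole_acceptance`.
    """
    image = []
    for row in sample:
        image_row = []
        for in_focus, out_of_focus in row:  # scan point by point
            image_row.append(in_focus + pinhole_acceptance * out_of_focus)
        image.append(image_row)
    return image

# Example: a bright in-focus feature sitting on a uniform out-of-focus haze
sample = [[(0.0, 1.0), (1.0, 1.0), (0.0, 1.0)]]
print(confocal_scan(sample))  # haze is suppressed relative to the feature
```

The sequential double loop also illustrates why confocal acquisition is slower than widefield imaging: one detector reading is needed per scan position.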
As shown in FIG. 3, images created by confocal microscopy techniques (e.g., as shown in FIGS. 3D, 3E, and 3F) are often much sharper than equivalent images created by widefield microscopy techniques (e.g., as shown in FIGS. 3A, 3B, and 3C), due to the significant rejection of out-of-focus background haze that confocal microscopy can achieve. However, confocal microscopy images generally take longer to acquire due to the sequential nature of the point-by-point imaging process.
As shown in FIG. 4, in both widefield microscopy (as shown in FIG. 4A) and confocal microscopy (as shown in FIG. 4B), the light illumination within a sample is not confined to the focal plane 400 or point of interest. Instead, the illumination light in each case extends throughout two solid cones 401 and 402 of incoming and outgoing illumination (relative to the focal plane), where the half-angle of each cone is dictated by the numerical aperture (NA) of the objective lens that focuses the light within the sample. The light within illumination cones 401 and 402 that does not contribute to the focused image increases the possibility of photo-induced damage (such as photobleaching) within a sample.
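Because the numerical aperture is defined by NA = n sin θ, the half-angle θ of each illumination cone follows directly from the NA and the refractive index n. A short numerical check, with objective parameters assumed purely for illustration:

```python
import math

def cone_half_angle_deg(NA, n):
    """Half-angle (degrees) of the illumination cone, from NA = n*sin(theta)."""
    return math.degrees(math.asin(NA / n))

# Assumed example objectives: a 1.2 NA water-immersion lens (n = 1.33)
# and a 0.8 NA air lens (n = 1.0)
print(cone_half_angle_deg(1.2, 1.33))  # roughly 64 degrees
print(cone_half_angle_deg(0.8, 1.0))   # roughly 53 degrees
```

The steep half-angles of high-NA objectives show how large a volume of the sample above and below the focal plane is traversed by light that never contributes to the focused image.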
Fluorescence microscopy can be used for intracellular imaging, and a wide variety of site-specific markers have been developed to yield high-contrast images of features of interest. Photobleaching can be problematic in fluorescence microscopy because many fluorescent marker molecules can undergo only a limited number of absorption/emission cycles before permanently photobleaching and becoming useless for creating a fluorescent image. Therefore, excitation of fluorescent marker molecules throughout the illumination cones 401 and 402 of widefield microscopy and confocal microscopy is wasteful, because only a small fraction of the photon budget of each molecule contributes to the desired signal.
Spatial resolution is usually somewhat superior in confocal microscopy compared to widefield microscopy, because the spatial resolution arises from a convolution of the effective excitation region with the effective detection region dictated by the pinhole 208. As shown in FIG. 5, an objective lens 101 can focus incoming monochromatic light 504 to a focal point 505. Because the light is monochromatic, rays of the focused light can have different wavevectors, k1, kz, and k2, not because their wavelengths are different, but because their directions are different. The sharpness of either the effective excitation region or the effective detection region created by the lens 101, and hence the resolution of the imaging system, is superior in the plane transverse to the objective axis, êz 500, compared to the direction along êz, because the highest spatial frequency in the plane transverse to the objective axis is:
$$(\Delta k_\perp)_{\max} = \left|(\mathbf{k}_1 - \mathbf{k}_2)\cdot\hat{e}_\perp\right| = \frac{2k\,\mathrm{NA}}{n} \qquad (1)$$
whereas the highest spatial frequency in the direction along the objective axis is:

$$(\Delta k_z)_{\max} = \left|(\mathbf{k}_z - \mathbf{k}_1)\cdot\hat{e}_z\right| = k\left(1 - \sqrt{1 - (\mathrm{NA}/n)^2}\right) \qquad (2)$$

where, for monochromatic light, |km|=k for all values of m; k1 501 and k2 502 are wavevectors at opposite edges of the illumination cone of the objective lens 101; kz=kêz 503; and n is the refractive index of the sample and its surrounding medium. Increased confinement of the excitation and detection regions improves sensitivity as well as resolution of the imaging system, because less spurious signal from effects such as autofluorescence and Rayleigh scattering is generated outside the region of interest, and the remaining spurious signal is more efficiently rejected prior to detection.
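Equations (1) and (2) can be evaluated numerically to see how much the transverse spatial-frequency bandwidth exceeds the axial one; the wavelength and objective parameters below are assumed for illustration only:

```python
import math

def max_spatial_frequencies(wavelength, NA, n):
    """Evaluate Eqs. (1) and (2) for an objective of given NA in a medium
    of index n; `wavelength` is the vacuum wavelength."""
    k = 2 * math.pi * n / wavelength               # wavenumber in the medium, |k_m| = k
    dk_perp = 2 * k * NA / n                       # Eq. (1): transverse
    dk_z = k * (1 - math.sqrt(1 - (NA / n) ** 2))  # Eq. (2): axial
    return dk_perp, dk_z

# Assumed example: 1.2 NA water-immersion objective (n = 1.33), 488 nm light
dk_perp, dk_z = max_spatial_frequencies(0.488, 1.2, 1.33)  # units: rad/um
print(dk_perp / dk_z)  # transverse bandwidth exceeds axial by severalfold
```

The ratio of roughly three in this example reflects the familiar observation that axial resolution of a single objective lags well behind its transverse resolution.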
Improving the performance of live cell imaging requires consideration of the demanding problem of time-resolved three-dimensional imaging of P ≥ 1 independent optical properties (e.g., polarization and/or wavelength at different three-dimensional positions and at different times). This requires binning the measured photons in hypervoxels of P + 4 dimensions (where a “hypervoxel” is a multidimensional pixel), which vastly increases the detected flux necessary to generate statistically significant data, compared to static two-dimensional or even three-dimensional imaging. Increasing the resolution in any of the P + 4 dimensions demands further subdivision, thereby exacerbating the problem. Furthermore, the photons within each hypervoxel originate from a 3D volume that decreases rapidly in size with increasing spatial resolution, making it increasingly difficult to generate the requisite flux in a non-invasive manner. Thus, the issues of image resolution, signal intensity, imaging speed, and sensitivity of the sample to damage by the imaging light are interrelated.
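The bin-count argument can be made concrete with rough bookkeeping: three spatial dimensions, one time dimension, and P optical properties multiply together into the total number of hypervoxels that must each receive statistically significant photon counts. All resolutions below are assumed illustrative values, not figures from this description:

```python
def hypervoxel_count(spatial_bins, time_bins, property_bins):
    """Total number of bins for a (3 + 1 + P)-dimensional measurement,
    where `property_bins` lists the number of bins for each of the P
    optical properties (e.g., polarization states, wavelength channels)."""
    total = spatial_bins ** 3 * time_bins
    for bins in property_bins:
        total *= bins
    return total

static_2d = 512 * 512                               # a single 2-D image
live_5d = hypervoxel_count(256, 100, [2])           # 3-D + time + polarization (P = 1)
print(live_5d / static_2d)                          # bins grow by orders of magnitude
```

Even these modest assumed resolutions multiply the bin count, and hence the required detected flux, by roughly four orders of magnitude over a static 2-D image, which is the interrelation of resolution, signal, speed, and photodamage noted above.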