In an imaging system, an optical aberration occurs when light from one point of an object does not converge to a single point after transmission through the system. Optical aberrations can be described by the distortion of the wavefront at the exit pupil of the system, where the exit pupil is the image of the aperture stop as seen from the image point. A wavefront is a locus of points of equal electromagnetic phase, and the local direction of energy propagation is normal to the wavefront surface; this direction can be described as a ray normal to the local wavefront. Ideally, light converging to a point has a spherical wavefront. FIG. 1 is a schematic showing an aberrated wavefront relative to an ideal reference wavefront. The deviation of an aberrated ray relative to an unaberrated ray at its intersection with the image plane is the lateral error of the aberrated ray, which is usually associated with blurring and loss of information. As shown in FIG. 1, an aberrated wavefront lags or leads the ideal reference wavefront at different locations across its surface; the amount of this lead or lag is the local phase error. One can also see that the local slope, or tilt, of the aberrated wavefront does not match that of the reference wavefront. Because the local direction of energy propagation is normal to the wavefront, this mismatch corresponds to rays propagating through the system along non-ideal paths. These aberrated rays are not coincident at the image plane, causing blurring and a loss of point-to-point image fidelity. Defocus is a special case of aberration in which the radius of curvature of the wavefront is such that the aberrated rays may still converge to a point, albeit away from the desired image plane.
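The relation between local wavefront slope and lateral ray error described above can be illustrated numerically. The following sketch (not part of the original disclosure; the defocus coefficient, reference-sphere radius, and the geometric-optics approximation ε ≈ -R ∇W are all assumptions for illustration) computes the transverse ray errors produced by a pure defocus wavefront error over a circular pupil:

```python
import numpy as np

# Illustrative sketch: a defocus wavefront error W(x, y) = c * (x^2 + y^2)
# over a unit-radius pupil, and the resulting transverse ray error at the
# image plane, eps = -R * grad(W), where R is the reference-sphere radius.
# All parameter values are assumed, in arbitrary consistent units.
c = 0.5e-6        # defocus coefficient (assumed)
R = 0.1           # reference-sphere radius (assumed)

x = np.linspace(-1.0, 1.0, 201)
X, Y = np.meshgrid(x, x)
pupil = X**2 + Y**2 <= 1.0

W = c * (X**2 + Y**2)                 # wavefront lead/lag (phase error)
dWdy, dWdx = np.gradient(W, x, x)     # local slope (tilt) error
eps_x = -R * dWdx                     # lateral ray error in x
eps_y = -R * dWdy                     # lateral ray error in y

# For pure defocus the ray error grows linearly with pupil radius, so the
# rays sweep out a blur circle instead of converging to a point:
blur_radius = np.max(np.hypot(eps_x, eps_y)[pupil])
print(f"geometric blur radius ~ {blur_radius:.2e} (same units as R)")
```

For pure defocus the maximum ray error occurs at the pupil edge and equals 2Rc, consistent with the statement that the defocused rays still converge, but to a point displaced from the desired image plane.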
In the classic microscope, points in an object plane correspond to points in a detection plane, often sensed by an array detector. The above description of aberration in an imaging system shows how light emanating from a particular object point can, in the presence of aberration, be distributed over a number of neighboring pixels of the image sensor. In a coherence microscope, a reference reflection causes the information recorded at the sensor to encode the phase of the incident light relative to the reference reflection. When a broad bandwidth of light is used to illuminate the coherence microscope, this enables processing that can extract the optical path length between scattering objects and the reference reflection. If each pixel of the image sensor is considered independently, a 3D volume can be constructed. In general, the aberrated rays, misplaced on the sensor, detract from the useful information.
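The depth-encoding principle above can be sketched for a single sensor pixel. In this toy model (wavenumber band, scatterer depth, and unit reflectivities are assumed values, not from the original text), the broadband spectral fringe between the reference reflection and a scatterer at depth z oscillates in wavenumber at a rate proportional to the round-trip path, so a Fourier transform over wavenumber localizes the scatterer:

```python
import numpy as np

# One-pixel sketch of spectral-domain depth extraction: the interference
# spectrum I(k) ~ 1 + cos(2*k*z) for a single scatterer at depth z.
k = np.linspace(7.0e6, 9.0e6, 2048)   # wavenumbers, rad/m (assumed band)
z = 50e-6                             # scatterer depth, metres (assumed)

spectrum = 1.0 + np.cos(2.0 * k * z)  # detected fringe at one pixel
a_line = np.abs(np.fft.rfft(spectrum - spectrum.mean()))

# Each FFT bin spans a path length of 2*pi / (k_max - k_min); halving the
# peak path accounts for the double-pass (illumination and return) geometry.
dp = 2.0 * np.pi / (k[-1] - k[0])
depth = np.argmax(a_line) * dp / 2.0
print(f"recovered depth ~ {depth * 1e6:.1f} um")
```

Repeating this transform independently for every pixel of the array detector yields the 3D volume referred to above.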
Recently it has been shown that the phase information in the original detected data set can be mathematically manipulated to correct for known aberrations in an optical coherence tomography volume [15] and in the closely related holoscopy [18,22]. Methods have been described which attempt to iteratively solve for unknown aberrations, but these methods have been very limited in the precision of the corrected aberration and are hindered by long execution times for the iterative calculations.
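The mathematical manipulation for a known aberration can be sketched in one dimension. In this simplified model (the Gaussian field and quadratic pupil phase are assumptions for illustration, not the cited methods), a known aberration phase φ(u) in the pupil (Fourier) domain is removed by multiplying the spectrum of the complex field by exp(-iφ) and transforming back:

```python
import numpy as np

# 1-D sketch of computational aberration correction: because the detected
# data are complex-valued, a known pupil phase can be undone in software.
N = 512
x = np.arange(N) - N // 2
field = np.exp(-(x / 4.0) ** 2).astype(complex)   # ideal point-like image

u = np.fft.fftfreq(N)                  # pupil (spatial-frequency) coordinate
phi = 800.0 * u**2                     # known defocus-like phase (assumed)

# Forward model: the aberration multiplies the pupil spectrum by exp(+i*phi).
aberrated = np.fft.ifft(np.fft.fft(field) * np.exp(1j * phi))
# Correction: multiply by the conjugate phase exp(-i*phi) and invert.
corrected = np.fft.ifft(np.fft.fft(aberrated) * np.exp(-1j * phi))

residual = np.max(np.abs(corrected - field))
print(f"residual after conjugate-phase correction: {residual:.2e}")
```

Because the conjugate-phase multiply is the exact inverse of the modeled aberration, the original complex field is restored to numerical precision; the difficulty noted above arises when φ is unknown and must be found iteratively.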
The use of Shack-Hartmann sensor based adaptive optics for wavefront aberration correction is well established in astronomy and microscopy for point-like objects to achieve diffraction-limited imaging [1-3]. It is currently an active field of research in optical coherence tomography/microscopy (OCT/OCM) [24,25]. Denk et al. describe a coherence-gated wavefront sensor in which the object is illuminated with a single point of focused low-coherence light to create an artificial ‘guide star’, and the focusing of the Shack-Hartmann sensor is realized either through a physical lenslet array or by a computational method; the low-coherence property of the light allows depth selection in the wavefront measurement (see, for example, EP Patent No. 1626257, Denk et al., “Method and device for wave-front sensing”). Recently, adaptive optics via pupil segmentation using a spatial light modulator (SLM) was demonstrated in two-photon microscopy [5]. The results showed that sample-induced optical aberrations, caused by changes in refractive index with depth in the sample, can be reduced to recover diffraction-limited resolution. This can improve the depth of imaging in tissues. Such a segmented-pupil approach has also been demonstrated with scene-based adaptive optics [6]. Recently, Tippie and Fienup demonstrated sub-aperture correlation based phase correction as a post-processing technique for synthetic-aperture digital off-axis holographic imaging of an extended object [7]. This method allows for correction of narrowband interferometric data in a sample in which scatter or reflection from multiple depths can be ignored.
The key to the above recent advancements lies in the availability of phase information. This information has been successfully exploited to implement digital refocusing techniques in OCT by measuring the full complex field backscattered from the sample. Current methods, however, rely on two assumptions: first, that the sample exhibits an isotropic and homogeneous structure with respect to its optical properties, and second, that the aberrations, if present, are well defined or accessible. Whereas the first limitation has not been addressed so far, the second issue can be solved either by assuming simple defocus and applying spherical wavefront corrections, or by iteratively optimizing the complex wavefront with a merit function that uses image sharpness as a metric [14,15].
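The second approach, iterative optimization against a sharpness metric, can be sketched as follows. In this toy 1-D model (the Gaussian object, pure-defocus assumption, brute-force parameter scan, and sum-of-squared-intensity metric are all illustrative assumptions), a trial defocus coefficient is scanned and the value maximizing image sharpness is retained:

```python
import numpy as np

# Toy sketch of sharpness-metric defocus estimation for complex-field data.
N = 512
x = np.arange(N) - N // 2
field = np.exp(-(x / 3.0) ** 2).astype(complex)    # sharp "object"

u = np.fft.fftfreq(N)
true_c = 600.0                                     # unknown defocus (assumed)
blurred = np.fft.ifft(np.fft.fft(field) * np.exp(1j * true_c * u**2))

def sharpness(img):
    # Sum of squared intensity: concentrated (sharp) images score higher.
    inten = np.abs(img) ** 2
    return float(np.sum(inten**2))

# Brute-force scan over trial defocus coefficients (a merit-function
# optimizer would search this space more efficiently).
trials = np.linspace(0.0, 1000.0, 201)
scores = [sharpness(np.fft.ifft(np.fft.fft(blurred) * np.exp(-1j * c * u**2)))
          for c in trials]
best_c = trials[int(np.argmax(scores))]
print(f"estimated defocus coefficient: {best_c:.0f} (true: {true_c:.0f})")
```

Even this simple one-parameter scan requires one forward/inverse transform pair per trial, which hints at why the iterative approaches noted above suffer from long execution times when many aberration coefficients are unknown.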