1. Field
Apparatuses and methods consistent with non-limiting embodiments of the present disclosure relate to non-image based autofocusing in an imaging system.
2. Description of the Related Art
Cellular biology increasingly relies on fully automated systems for efficiently collecting imaging data from multiple samples. Often, the samples are disposed in microplates having a large well capacity, such as ninety-six wells, 384 wells, or an even greater quantity of wells.
High magnification microscopy methods are commonly employed to observe and record images of cells disposed in the wells of the microplate. The cell images may be captured, for example, in a fluorescence mode, a bright field mode, or a phase contrast mode. Thereby, cell growth over time or cell reaction to stimulus over time may be continuously monitored.
A new style of microplate based instrument that combines an imaging mode with an analysis mode, while providing a live cell environment for studying cells, is described in Zimenkov et al. (U.S. Pub. 2012/0300194), the entire contents of which are incorporated herein by reference, and which illustrates (e.g., in FIG. 11A thereof) an imaging part of the instrument.
An important step in obtaining a proper microscopy image is the ability to focus a microscope objective on the sample disposed in the well of the microplate. In general, two methods of focusing the microscope objective are deployed in conventional automated systems: (1) image-based sensing and (2) position sensing, each of such methods having advantages and limitations.
In image-based sensing for focusing the microscope objective, a series of images is captured at various positions of the microscope objective relative to the bottom of the microplate well. That is to say, the microscope objective may be variously positioned closer to the microplate well or farther from the microplate well. Imaging may be performed at the various positions, and an optimal image is selected, for example, based on sharpness of features of interest.
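The image-based sweep described above may be expressed as the following conceptual sketch (illustrative only, not part of any disclosed apparatus), in which `capture` is a hypothetical callable that returns an image at a given objective position and sharpness is scored by a simple Laplacian-variance metric:

```python
import numpy as np

def sharpness(image: np.ndarray) -> float:
    """Variance of a discrete Laplacian response: higher means sharper."""
    lap = (-4.0 * image[1:-1, 1:-1]
           + image[:-2, 1:-1] + image[2:, 1:-1]
           + image[1:-1, :-2] + image[1:-1, 2:])
    return float(lap.var())

def image_based_autofocus(capture, z_positions):
    """Capture an image at each objective position and return the
    position whose image maximizes the sharpness metric."""
    scores = [(sharpness(capture(z)), z) for z in z_positions]
    return max(scores)[1]
```

Note that each call to `capture` exposes the sample to illumination, which is precisely the drawback of this technique discussed below.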
In conventional automated systems, additional hardware is not required to implement the image-based sensing autofocus technique, thus increasing robustness of the automated system while reducing cost. However, a drawback of the image-based sensing autofocus technique is that it requires additional time and additional exposure of the sample to radiation, which may, for example, bleach fluorophores or damage living cells.
The position-based sensing for focusing of the microscope objective implements complex hardware (in addition to imaging hardware) to determine a reference reflecting surface (for example, a bottom of a well of the microplate, with a subsequent offset for image capturing, or, preferably, a well/fluid interface). Once the reference surface is determined, an image is captured with the microscope objective focused at the reference reflecting surface or at some selected offset therefrom that is fixed for the experiment.
In general, the position-based sensing autofocus technique executes more quickly than the aforementioned image-based sensing autofocus technique. Moreover, the position-based sensing autofocus technique typically employs radiation at a wavelength longer than the radiation wavelengths used for imaging the samples, and therefore the position-based sensing autofocus technique is less likely to damage the sample.
Various hardware and software for implementing the position-based sensing autofocus technique are provided by, for example, Yoneyama et al. (U.S. Pat. No. 7,345,814) and LaPort (U.S. Pat. No. 8,867,180). As discussed therein, the position-based sensing autofocus technique relies upon a narrow beam of radiation (typically a laser, though an LED or any other radiation source capable of producing a narrow beam would be acceptable) output via the objective to the sample at an off-axis position. The position and/or shape of the reflected beam is analyzed by a position sensitive detector.
In another example, Li et al., "Autofocus System for Microscope," Opt. Eng. 41(6), 1289-1294 (Jun. 1, 2002), describes the theoretical background of a laser-based autofocus system for an infinity corrected microscope that uses a symmetrical silicon photocell as a detection device.
FIG. 1 illustrates a conceptual layout of the position-based autofocus system described in Li et al.
As illustrated in FIG. 1, an infinity corrected optics layout is presented with objective 3003, tube lens 3010, and position sensitive detector 3012, which may be a symmetrical silicon photocell. A narrow beam of light 3001, typically from a laser or an LED, is projected at an offset 3002 relative to an optical axis towards the objective 3003. The beam 3001 is reflected by sample surface 3005. When the objective 3003 is focused on the sample surface 3005, the reflected beam reaches the position sensitive detector 3012 at an optical axis position. If the sample surface 3005 is offset from focus by an offset 3007, the reflected beam will reach the detector at a corresponding distance 3015 from the optical axis, as there is a direct correlation between the offset 3007 in the sample space and the offset 3015 on the position sensitive detector 3012. The position sensitive detector 3012 provides a signal proportional to the offset 3015. As a result, the autofocus process entails adjusting the position of the objective 3003 so as to maintain a constant signal from the position sensitive detector 3012 equal to the signal obtained from an optimally focused sample, from an imaging quality standpoint.
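Because the detector signal is proportional to the defocus offset, the autofocus process reduces to a feedback loop that drives the signal toward its in-focus setpoint. The following sketch is illustrative only (the callables `read_detector` and `move_objective` are hypothetical stand-ins for the detector readout and objective actuator, and a simple proportional correction is assumed):

```python
def autofocus_loop(read_detector, move_objective, setpoint,
                   gain=0.5, tol=1e-3, max_iter=100):
    """Proportional feedback: repeatedly nudge the objective until the
    position sensitive detector reports the in-focus setpoint signal.

    Returns True if the signal converged within tolerance, else False.
    """
    for _ in range(max_iter):
        error = setpoint - read_detector()
        if abs(error) < tol:
            return True
        # Correction proportional to the detector error; the sign and
        # scale of `gain` depend on the optical geometry.
        move_objective(gain * error)
    return False
```

Commercial implementations may use more elaborate control schemes, but the proportionality between sample-space offset and detector-space offset is what makes such closed-loop operation possible.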
An example of a commercial laser-based autofocus module is the Nikon Perfect Focus™ system, which is a complex laser-based device that controls focus of the microscope objective in real time. Such system, for example, may be incorporated with a standard microscope or imaging reader similar to that described in Zimenkov et al.
FIG. 2 illustrates conceptual incorporation of a laser-based autofocus module into an imaging reader.
The laser-based autofocus module 3110 may be the Nikon Perfect Focus™ system, and the imaging reader may be the imaging system of Zimenkov et al.
The autofocus module 3110 may be disposed between an epi-fluorescence cube 1210 and an objective 1230. A charge-coupled device (CCD) 3106 may be used as a position sensitive detector. A laser 3102 may be an 870 nm laser, and the dichroic 3104 that introduces the laser into the objective should pass, unhindered, both excitation radiation from the epi cubes to the sample and emission radiation from the sample to the imaging detector 1260. A short pass dichroic 3104 with a cutoff wavelength of about 800 nm should reflect the 870 nm laser beam but will pass all radiation below about 750 nm, namely a range sufficient for most fluorophores.
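The wavelength-routing rationale above can be summarized in a trivial illustrative sketch (not part of any disclosed apparatus), assuming an idealized short pass dichroic that transmits below its cutoff and reflects above it:

```python
def shortpass_route(wavelength_nm: float, cutoff_nm: float = 800.0) -> str:
    """Idealized short pass dichroic: transmit imaging/fluorescence light
    below the cutoff, reflect the longer-wavelength autofocus laser."""
    return "transmit" if wavelength_nm < cutoff_nm else "reflect"
```

With an 800 nm cutoff, the 870 nm autofocus laser is reflected into the detection arm while sub-750 nm fluorescence passes to the imaging detector, so the two optical functions share one path without conflict.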
The optics of the autofocus module may be in line with the standard imaging optics, and thus do not interfere with imaging functionality. As illustrated by the darkened portions, autofocus beams emitted to and reflected from the sample do not interfere with the lightened lines indicating imaging beams emitted to and reflected from the sample. Thus, the positioning of the objective relative to the sample may be performed in real time, without delay or additional movements between autofocus and image capturing, thereby improving processing speed.
However, additional space is necessary to configure such combined implementation, and additional hardware components are required in addition to hardware supporting the sample imaging functionality.
Another laser-based autofocus system is described in LaPort. As illustrated therein (e.g., FIG. 1), a mirror may be switched into the optical path during autofocus and removed during actual image capturing. The position sensitive detector may be used for autofocus, while an image is captured by a camera.
FIG. 3 illustrates conceptual incorporation of a laser-based autofocus module into an imaging reader.
The laser-based autofocus module 3210 may be the system of LaPort, and the imaging reader may be the imaging system of Zimenkov et al.
The laser 3202 emits a beam towards objective 1230 via movable mirror 3204. After autofocus is performed with the assistance of the position sensitive detector 3206, the mirror 3204 is removed from the optical path and the sample is imaged. The switch from autofocus to image capturing is slower than that illustrated in FIG. 2, though the delay introduced by moving a lightweight mirror into and out of the optical path may be small.
Again, considerable additional space would be required to implement the configuration of FIG. 3, and new components in addition to the hardware supporting the imaging functionality would be required.
The autofocus systems described above employ a complex set of additional parts beyond those already present in conventional imaging systems. The extra space such systems demand presents a challenge for the designer of a dedicated microplate based imaging system similar to that of Zimenkov et al. Although the microplate size sets a lower limit on the overall system size, implementing a compact unit that can be placed into a laminar flow hood or completely inside an incubator is a key design goal for a live cell microplate based imaging system. Moreover, additional spacing gaps between optical components for a laser autofocus subsystem may be unattractive from an overall design standpoint. Thus, an autofocus system accomplished without extra space beyond the main sample imaging optical tract would be highly desirable.