In many fields, imagers are used to examine particles that have been deposited on planar substrates. For example, a microscope can be used to examine blood cells that have been deposited in a thin layer on a glass slide.
In some of these fields, such as cellular astronomy, it is desirable to examine deposited particles in a systematic manner, for example by examining all of the particles within a selected zone (an “examination zone”) on the substrate. See, e.g., Howard M. Shapiro, Cellular Astronomy—A Foreseeable Future in Cytometry, 60A CYTOMETRY PART A 115-124 (2004). Because the examination zone is usually larger than the field of view of the imager, a systematic examination of deposited particles generally requires dividing the examination zone into regions, each no larger than the imager's field of view, and serially imaging those regions.
The systematic imaging of particles under these conditions requires addressing one or more of the following three challenges: first, how to quickly, effectively, and at least semi-automatically focus the imager on the particles in each region prior to acquiring an image (the auto-focus challenge); second, how to combine the regional images to represent some or all of the examination zone (the image registration challenge); and third, in the case of optical imaging systems, how to compensate for chromatic aberration. Each of these challenges is briefly addressed below.
The auto-focus challenge. Before regional images are obtained, an imager will typically perform a focusing step to determine an optimal focal plane. The optimal focal plane will often be chosen to coincide with (or come acceptably close to) the average level of the particles to be imaged. It may be necessary or desirable to repeat the focusing step before images of additional regions are acquired, especially when the planar substrate is not sufficiently flat at a microscopic level, causing the optimal focal plane to vary unacceptably from region to region. In optical microscopy, for example, focusing can be achieved by varying the distance between the objective lens and the microscope stage. To achieve this, the stage can be moved in the z direction (orthogonal to the plane of the stage) while the optics remain fixed, or vice versa.
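As an illustrative sketch only (not from any specific imager), the focusing step described above is commonly automated by sweeping the stage through candidate z positions and scoring each resulting image with a contrast metric; the variance-of-Laplacian metric, the simulated z-stack, and all numeric values below are assumptions chosen for illustration.

```python
import numpy as np

def focus_score(image):
    """Contrast-based focus metric: variance of a discrete Laplacian.
    In-focus images have sharper local intensity changes, so the
    score peaks near the optimal focal plane."""
    lap = (-4.0 * image
           + np.roll(image, 1, axis=0) + np.roll(image, -1, axis=0)
           + np.roll(image, 1, axis=1) + np.roll(image, -1, axis=1))
    return lap.var()

def best_focal_plane(z_stack):
    """Index of the z-slice with the highest focus score."""
    return int(np.argmax([focus_score(img) for img in z_stack]))

# Simulated z-stack: a spot that blurs as the stage moves away from
# the true focal plane (z_true); values are illustrative only.
def blurred_spot(sigma, size=33):
    y, x = np.mgrid[:size, :size] - size // 2
    return np.exp(-(x**2 + y**2) / (2.0 * sigma**2))

z_true = 3
stack = [blurred_spot(1.0 + 0.8 * abs(z - z_true)) for z in range(7)]
assert best_focal_plane(stack) == z_true
```

The same sweep-and-score structure underlies many commercial auto-focus routines; the difficulty discussed below is that low-contrast, low-resolution images flatten the score curve, making its peak slow and unreliable to find.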
When objects being imaged exhibit low contrast or are viewed under conditions of low resolution, auto-focusing can be especially challenging. For example, because cells are mostly water and are not strongly absorbing in the visible spectrum, they can exhibit low contrast when imaged in an aqueous medium under brightfield conditions without having first been stained by an agent that absorbs visible light. Further, at sufficiently low magnifications, the finite spatial resolution of imaging system components (e.g., of a lens or of a digital detector) can become manifest. For example, at low magnifications, image signals may impinge on too few detector pixels to capture fully the image details. One resolution-limiting phenomenon is known as the “partial volume effect.” This effect occurs where image detail is not appropriately contained within a pixel of a detector and instead “spills over” into one or more neighboring pixels. Image intensity is thereby diluted over these neighboring, partially filled pixels, and both object features and background features partially contribute to the pixel signal. Intensity values in an image are distorted, such that, for example, a small bright object will appear to be larger and dimmer than it should.
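The intensity dilution described above can be sketched numerically. The toy model below (all sizes and values are illustrative assumptions, not measured data) bins a high-resolution scene onto a coarse detector: the same small bright object yields one bright pixel when centered within a detector pixel, but spills into several dimmer pixels when it straddles a pixel boundary.

```python
import numpy as np

def detect(scene, bin_size):
    """Integrate a high-resolution 'scene' onto a coarse detector
    by summing bin_size x bin_size blocks (one block per pixel)."""
    h, w = scene.shape
    return scene.reshape(h // bin_size, bin_size,
                         w // bin_size, bin_size).sum(axis=(1, 3))

bin_size = 10
scene = np.zeros((60, 60))

# Case 1: a small bright object centered inside one detector pixel.
scene[22:28, 22:28] = 1.0
centered = detect(scene, bin_size)

# Case 2: the same object straddling a pixel boundary.
scene[:] = 0.0
scene[27:33, 27:33] = 1.0
straddling = detect(scene, bin_size)

# Total signal is conserved, but the peak is diluted across
# neighboring, partially filled pixels: the object appears
# larger and dimmer than it should.
assert centered.sum() == straddling.sum()
assert straddling.max() < centered.max()
assert (straddling > 0).sum() > (centered > 0).sum()
```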
Commercial auto-focus methods and algorithms can partially or fully automate the focusing step. Even using automated methods, however, the process of focusing on some particles can be challenging and time-consuming. These challenges are particularly acute in the context of many cellular astronomy and other screening applications. Such applications are often performed under conditions designed to increase speed or throughput at the expense of image contrast, image resolution, or both. For example, low image magnification can lead to a lack of contrast against the background, which makes it difficult for automated focusing algorithms to operate efficiently.
In addition, where high throughput is important, particles, such as cells, are frequently viewed at low magnification, or even at unity or sub-unity (demagnifying) magnification, in order to increase the number of particles within an imager's field of view. Although this has the potential to improve screening times, low magnification decreases the quality (resolution) of the image data available to the auto-focuser, making it more difficult and more time-consuming for the auto-focuser to determine when an object is in optimal focus. Sometimes, the signal associated with a cell is registered by only a few, or even just one, pixel of a detector, making focusing, including auto-focusing, especially difficult.
Existing methods to address the auto-focus challenge suffer from limitations. For example, the utilization of physical markings in the substrate induces the auto-focuser to focus on the substrate, and not necessarily on the particles (e.g., cells) that are on or above the substrate. This can result in the introduction of out-of-focus artifacts that reduce image quality and signal-to-noise ratio, particularly at lower magnifications or with high depth of field. The use of range-finding methods requires the incorporation of expensive optical sensors and feedback loops (often costing thousands of dollars), and it adds to the overall complexity of the imaging apparatus. Using fluorescence channel methods can lead to potential fluorophore photobleaching or require fluorescence compensation in the event that multiple fluorophores are used, thereby complicating sample preparation and potentially impeding the quantitative analysis of cellular markers.
The image registration challenge. The second challenge with the systematic examination of large numbers of particles, such as cells, is the post-acquisition registration of data, information, or images. This is necessary, for example, to create an accurate “panorama” of the examination zone by combining data or images from the various regions. Image registration can also be required when superimposing multiple images of the same region, each corresponding to a different wavelength of light. In multi-color fluorescence microscopy, for example, multiple images can be taken, each capturing target details at a particular wavelength (or color), and these images are then registered and superimposed. The fundamental challenge relates to the alignment, combination, superposition, or mapping of multiple images that represent the targets, where each image has its own coordinate system. Image registration can involve the transformation of each image into a common coordinate set. This common coordinate set can be chosen such that, when images are combined, image details that arise from a target at a given location on the substrate are co-localized.
To facilitate this registration step, the regions are often chosen to be slightly smaller than the imager's field of view, so that each image includes not only an image of a full region, but a portion of one or more bordering regions as well. This overlap between images can aid the alignment of multiple images, because corresponding features of an object, appearing in more than one image, can be used as alignment markers. Such markers are known in the art as fiducial markers, fiduciary points, or reference points. Reference points can be incorporated into the substrate (“hard-coded fiduciary points”), or they can be extracted from the sample itself (“soft-coded fiduciary points”).
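One common way to exploit such overlap, sketched below under assumptions not tied to any particular system, is cross-correlation: shared features in the overlapping strip act as soft fiducials, and the translation that maximizes the correlation between two fields of view gives their relative displacement. The random test field and the (3, -5) pixel shift are illustrative values only.

```python
import numpy as np

def estimate_shift(ref, moving):
    """Estimate the integer (dy, dx) displacement of 'moving' relative
    to 'ref' via FFT-based cross-correlation; shared image features act
    as the alignment markers (soft-coded fiducials)."""
    ref = ref - ref.mean()
    moving = moving - moving.mean()
    corr = np.fft.ifft2(np.conj(np.fft.fft2(ref)) * np.fft.fft2(moving)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrap-around indices back to signed shifts.
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return int(dy), int(dx)

rng = np.random.default_rng(0)
field = rng.random((64, 64))                    # stand-in for shared cell features
shifted = np.roll(field, (3, -5), axis=(0, 1))  # same scene seen displaced
assert estimate_shift(field, shifted) == (3, -5)
```

The recovered shift defines the transformation into a common coordinate set. Note that this approach presumes the overlap contains distinctive features, which is exactly what low-contrast, unstained cells often fail to provide, as discussed next.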
Existing methods to address the image registration challenge suffer from limitations. Achieving image registration through the use of “soft-coded fiduciary points” requires the acquisition of cellular images to be followed by post-processing to extract patterns within the sample that can be used to align neighboring fields of view. As with the process of focusing prior to image acquisition, the process of image registration can be time-consuming and computationally complex, particularly when the object to be imaged lacks clearly defined internal landmarks. This is often the case when imaging cells, which are irregular in shape and exhibit low contrast in many imaging situations, for example in brightfield microscopy when the cells are unstained and viewed at low magnification, a situation commonly encountered in cellular astronomy. The incorporation of hard-coded fiduciary points into a substrate requires substrate modification, which can be expensive or require complex manufacturing processes. Further, hard-coded fiduciary points within or below the matrix of the substrate would probably not lie within the focal plane of the cells; points above the matrix of the substrate might lie within the focal plane of the cells, but they could physically interfere with the deposition of cells onto the surface, decrease the effective area available for cells to occupy, or require complex manufacturing (e.g., micropatterning).
The chromatic aberration correction challenge. In optics, chromatic aberration is a distortion wherein an optical component (e.g., a lens or objective) does not focus all wavelengths of light to the same convergence point. Chromatic aberration can be particularly problematic in some imaging modalities, such as when viewing cells that have been treated with fluorescent labels (e.g., “green” and “red” labels) that bind selectively to specific cell types or subtypes. Some cells will bind only green labels, others will bind only red labels, some will bind neither label, and some will bind both. This can allow the determination of a cell's type or subtype by visualizing the color emitted by the label(s) tagging that cell.
In one type of experiment, two successive component images are taken, such as, for example, a first wherein only fluorescence from the green labels is received by a detector, which is used to create a “green image,” and a second wherein only fluorescence from the red labels is received by the detector, which is used to create a “red image.” The green and red images are registered and then overlaid. In this overlay, cells tagged with the green label appear green, cells tagged with the red label appear red, and cells tagged with both labels appear yellow (the combination of red light and green light). From this overlay, cell types and subtypes can be assigned to individual cells based on color.
Proper registration of the green and red images is essential to an accurate assignment of cell types and subtypes in this manner. Registration ensures that image details that arise from the same substrate coordinates are mapped to the same coordinates of the image space, e.g., red and green light that arises from a particular cell is mapped to the same coordinates on the green image and the red image. Accordingly, when properly registered images of a cell tagged with both green and red labels are overlaid, the result is a single yellow dot (a combination of green light and red light). If the green and red images are not properly registered, however, the overlaid image could instead show two dots, one green and the other red, erroneously suggestive of two singly-labeled cells. One challenge, described above, with image registration is correctly aligning images that are taken of different regions of a substrate. Another challenge is aligning multiple images taken of the same region of a substrate, where the extent of chromatic aberration is appreciable.
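The failure mode described above can be made concrete with a toy overlay. In the sketch below (the 4-pixel shift, image sizes, and single-pixel cells are illustrative assumptions), a doubly-labeled cell appears as two separate dots until the inter-channel shift is corrected, after which the green and red signals co-localize into a single "yellow" spot.

```python
import numpy as np

# A doubly-labeled cell at (20, 20): the "green" and "red" component
# images should co-localize, but a hypothetical chromatic shift of
# 4 pixels displaces the red image.
green = np.zeros((40, 40)); green[20, 20] = 1.0
red   = np.zeros((40, 40)); red[20, 24]   = 1.0  # displaced by aberration

def overlay_dots(g, r):
    """Number of bright pixels in a naive green+red overlay. With
    single-pixel cells, co-localized ('yellow') signals count once;
    misaligned signals count twice."""
    return int(((g > 0) | (r > 0)).sum())

assert overlay_dots(green, red) == 2            # looks like two cells
red_corrected = np.roll(red, -4, axis=1)        # undo the known shift
assert overlay_dots(green, red_corrected) == 1  # single "yellow" cell
```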
Chromatic aberration can pose a significant challenge to the registration of images obtained using different wavelengths of light, because light of different wavelengths emanating from the same region of the substrate maps to different image coordinates. This aberration requires correction before the images are overlaid. In some applications, the need to correct chromatic aberration before overlaying images is particularly acute. In cellular astronomy, for example, the image of a cell may span only a few pixels, a size on the order of the chromatic aberration shift. A shift of this magnitude relative to the size of a cell's image could confound the classification of cells based on their fluorescence (e.g., red-label or green-label) profile.
Prevention or mitigation of chromatic aberration can require a substantial investment in equipment. Correction for chromatic aberration can be challenging for a variety of reasons. For example, it can be difficult even to quantify the extent of chromatic aberration. Identification of image features, e.g., green dots or red dots, that are known to correspond to each other, e.g., relate to the same cell, can be challenging. In addition, the extent of chromatic aberration can depend not only on wavelength but also on position in the x-y plane of the substrate.
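One simple way the shift could in principle be quantified, sketched here under assumed conditions, is to image a feature known to appear in both channels (e.g., a doubly-labeled calibration object) and compare the intensity-weighted centroids of its green and red images; repeating this across the field of view would also capture the x-y positional dependence noted above. The object positions and sizes below are hypothetical.

```python
import numpy as np

def centroid(img):
    """Intensity-weighted centroid of an image, as (y, x)."""
    total = img.sum()
    ys, xs = np.indices(img.shape)
    return (ys * img).sum() / total, (xs * img).sum() / total

# Hypothetical calibration feature imaged in both channels; the
# centroid difference estimates the local chromatic shift.
green = np.zeros((32, 32)); green[10:13, 10:13] = 1.0
red   = np.zeros((32, 32)); red[10:13, 13:16]   = 1.0

gy, gx = centroid(green)
ry, rx = centroid(red)
shift = (ry - gy, rx - gx)   # (dy, dx) between red and green channels
assert shift == (0.0, 3.0)
```

As the passage above notes, the hard part in practice is not the arithmetic but reliably identifying which green and red features correspond to the same underlying object.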
In summary, existing methods for auto-focusing and image registration are limited in their scope, utility, and/or versatility, particularly under the conditions often encountered in high-throughput methods such as cellular astronomy. Accordingly, there exists a need for new and improved methods for quickly and efficiently focusing on particles deposited on planar substrates prior to image acquisition, and for quickly and efficiently registering and aligning multiple images post-acquisition. There exists a particular need for methods and techniques that can, simultaneously or sequentially, achieve both quick and efficient auto-focusing and image registration. There exists a further need for such methods that can be extended to correcting or compensating for chromatic aberration, within or between images.