1. Technical Field
The present invention relates generally to a system and method for 3-dimensional (3D) image reconstruction in a spiral scan cone beam computed tomography (CT) imaging system and, more specifically, to a spiral scan cone beam CT system and method that accurately reconstructs an image of a ROI (region of interest) within a long object by removing components associated with data contamination.
2. Description of Related Art
A system employing cone beam geometry has been developed for three-dimensional (3D) computed tomography (CT) imaging that comprises a cone beam x-ray source and a 2D area detector. An object to be imaged is scanned, preferably over a 360 degree angular range and along its entire length, by any one of various methods wherein the position of the area detector is fixed relative to the source, and relative rotational and translational movement between the source and object provides the scanning (irradiation of the object by radiation energy). The cone beam approach for 3D CT has the potential to achieve 3D imaging in both medical and industrial applications with improved speed, as well as improved dose utilization when compared with conventional 3D CT apparatus (i.e., a stack of slices approach obtained using parallel or fan beam x-rays).
As a result of the relative movement of the cone beam source to a plurality of source positions (i.e., “views”) along the scan path, the detector acquires a corresponding plurality of sequential sets of cone beam projection data (also referred to herein as cone beam data or projection data), each set of cone beam data being representative of x-ray attenuation caused by the object at a respective one of the source positions.
Various methods have been developed for 3D image reconstruction for cone beam x-ray imaging systems. For example, a filtered backprojection (FBP) cone beam image reconstruction technique is described by Kudo, H. and Saito, T., in their article entitled “Derivation and Implementation of a Cone-Beam Reconstruction Algorithm for Nonplanar Orbits”, IEEE Trans. Med. Imag., MI-13 (1994) 196-211.
Briefly, the FBP technique comprises the following steps at each cone beam view (i.e., at each position of the radiation source as it scans about the object, and at which an imaging detector acquires a corresponding set of projection data):
1. Compute a 1-dimensional projection (i.e., line integral) of the measured cone beam image acquired on a detector plane 1 at each of a plurality of angles θ. This step is illustrated by FIG. 1A for a given angle θ₁ of the plurality of angles θ, where the projection at coordinates (r, θ) comprises the integrated values of the cone beam image 4 on detector plane 1 along a plurality of parallel lines L(r, θ) that are normal to angle θ, each line L being at an incremental distance r from an origin O. Generally, if the detector plane 1 comprises an N by N array of pixels, then the number of angles θ is typically given by πN/2.
2. Filter each 1D projection in accordance with a d/dr filter, resulting in a new set of values at each of the (r, θ) coordinates, such as shown by filtered projection 6 for the angle θ₁ in FIG. 1A.
3. Normalize the filtered projections with a normalization function M(r, θ). Normalization is needed to take into account the number of times the plane of integration Q(r, θ), which intersects the source position and the line L(r, θ), intersects the scan path, since the data developed at each scan path intersection makes a contribution to the image reconstruction on the plane Q(r, θ).
4. Backproject the filtered projection 6 from each angle θ into a 2D object space 7 that coincides with the detector plane 1. This step is illustrated by FIG. 1B, wherein lines 8 spread the value from each (r, θ) coordinate into 2D space 7 in a direction normal to each angle θ.
5. Perform a 1D d/dt filtering of the backprojection image formed in 2D space 7 by step 4. The 1D filtering is performed in the direction of the scan path, i.e., along lines 10, where the arrowhead points in the direction of the scan path.
6. Perform a weighted 3D backprojection of the resulting data in 2D space 7 (i.e., from each pixel in the detector) onto a plurality of sample points P in a 3D object volume 12. The density assigned to each point P is weighted by the inverse of the square of the distance between the point and the spatial coordinates of the x-ray source (see Equation (59) of the forenoted Kudo et al article).
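As a rough illustration of steps 2 and 6 above, the following Python sketch applies a d/dr derivative filter to a stack of 1D projections and weights detector values by the inverse square of the point-to-source distance; the array shapes and geometry here are hypothetical toy values, not the formulation of Kudo et al.

```python
import numpy as np

def ddr_filter(projections, dr=1.0):
    """Step 2: d/dr derivative filter applied to each 1D projection.
    projections has shape (n_angles, n_r); the derivative is taken
    along the radial coordinate r."""
    return np.gradient(projections, dr, axis=1)

def weighted_backproject(values, point_xyz, source_xyz):
    """Step 6 weighting: each value contributed to a sample point P is
    weighted by the inverse square of the distance between P and the
    x-ray source position."""
    d2 = np.sum((point_xyz - source_xyz) ** 2, axis=-1)
    return values / d2

# toy usage with hypothetical numbers
proj = np.tile(np.arange(8.0), (4, 1))     # 4 angles, 8 radial samples
filt = ddr_filter(proj)                     # derivative of a linear ramp is 1
pts = np.array([[0.0, 0.0, 2.0]])           # one sample point P
src = np.array([0.0, 0.0, 0.0])             # hypothetical source position
w = weighted_backproject(np.array([8.0]), pts, src)   # 8 / 2**2
```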
The above procedure will be referred to hereinafter as the 6-step process. It is assumed in this process that the entire cone beam image of the object is captured on the detector of the imaging system. Consider a plane Q(r, θ), which intersects the object, formed by the source and the line L(r, θ) on the detector at angle θ and at a distance r from the origin. Ignoring the function M(r, θ), operations 1 through 6 compute the contribution to the reconstructed object density on the plane Q(r, θ) from the x-ray data illuminating the plane and its immediate vicinity. Since the 6-step process is detector driven, a contribution from the data illuminating the plane is computed every time the plane intersects the scan path and thus is illuminated by the x-ray beam. Consequently, the function M(r, θ) is used after the filter function in step 2 to normalize the results. Normalization is particularly undesirable since it requires pre-computing and storing a 2D array M(r, θ) for each source position along an imaging scan path. Since there are usually hundreds, if not thousands, of source positions, this type of normalization is both computationally intensive and resource (computer memory) expensive.
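The number of scan-path intersections that M(r, θ) must account for can be pictured with a small numerical sketch; the helix parameters and the plane below are hypothetical, chosen only to show the counting:

```python
import numpy as np

def helix(t, radius=1.0, pitch=0.5):
    """Hypothetical spiral scan path, parameterized in radians."""
    return np.stack([radius * np.cos(t), radius * np.sin(t),
                     pitch * t / (2 * np.pi)], axis=-1)

def count_plane_intersections(normal, offset, t0, t1, n=10000):
    """Count sign changes of the plane equation n.x - offset along the
    path; each sign change is one scan-path intersection of the plane
    Q(r, theta), i.e., one contribution that M(r, theta) normalizes."""
    t = np.linspace(t0, t1, n)
    s = helix(t) @ np.asarray(normal) - offset
    return int(np.count_nonzero(np.diff(np.sign(s))))

# the vertical plane x = 0 is crossed twice per turn of the helix,
# so two turns give four intersections
m = count_plane_intersections([1.0, 0.0, 0.0], 0.0, 0.0, 4 * np.pi)
```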
As is well known, and fully described for example in U.S. Pat. No. 5,257,183 entitled: METHOD AND APPARATUS FOR CONVERTING CONE BEAM X-RAY PROJECTION DATA TO PLANAR INTEGRAL AND RECONSTRUCTING A THREE-DIMENSIONAL COMPUTERIZED TOMOGRAPHY (CT) IMAGE OF AN OBJECT, issued Oct. 26, 1993, incorporated herein by reference, one known method of image reconstruction processing generally begins by calculating Radon derivative data from the acquired cone beam data. The Radon derivative data is typically determined by calculating line integrals for a plurality of line segments L drawn in the acquired cone beam data. In the embodiment described in detail in the U.S. Pat. No. 5,257,183 patent, Radon space driven conversion of the derivative data is used to develop an exact image reconstruction of a region-of-interest (ROI) in the object.
A cone beam data masking technique which improves the efficiency of the calculation of the Radon derivative data in such a Radon space driven technique is described in U.S. Pat. No. 5,504,792 entitled METHOD AND SYSTEM FOR MASKING CONE BEAM PROJECTION DATA GENERATED FROM EITHER A REGION OF INTEREST HELICAL SCAN OR A HELICAL SCAN, issued Apr. 2, 1996, also incorporated herein by reference. The masking technique facilitates efficient 3D CT imaging when only the ROI in the object is to be imaged, as is normally the case. In the preferred embodiment described therein, a scanning trajectory is provided about the object, the trajectory including first and second scanning circles positioned proximate the top and bottom edges, respectively, of the ROI, and a spiral scanning path is connected therebetween. The scanning trajectory is then sampled at a plurality of source positions where cone beam energy is emitted toward the ROI. After passing through the ROI the residual energy at each of the source positions is acquired on an area detector as a given one of a plurality of sets of cone beam data. Each set of the cone beam data is then masked so as to remove a portion of the cone beam data that is outside a given sub-section of a projection of the ROI in the object and to retain cone beam projection data that is within the given sub-section. The shape of each mask for a given set of cone beam data is determined by a projection onto the detector of the scan path which is above and below the source position which acquired the given set of cone beam data. The masked (i.e., retained) cone beam data is then processed so as to develop reconstruction data. An exact image of the ROI is developed by combining the reconstruction data from the various source positions which intersect a common integration plane. Hence, the masks are commonly referred to as “data-combination” masks.
Data-combination masks can also be used to improve the efficiency of the calculation of the derivative data in a detector data driven technique, such as the simplified ramp filter technique described in U.S. Pat. No. 5,881,123 entitled SIMPLIFIED CONE BEAM IMAGE RECONSTRUCTION USING 3D BACKPROJECTION, issued Mar. 9, 1999, also incorporated herein by reference. This technique reconstructs the image using 2D approximation data sets formed by ramp filtering the masked cone beam data in the direction of the projection of a line drawn tangent to the scan path at the source position that acquired that set of cone beam data. Although this technique is less complex than the prior techniques, the reconstructed image is not exact.
In U.S. Pat. No. 5,926,521, entitled EXACT REGION OF INTEREST CONE BEAM IMAGING USING 3D BACKPROJECTION, issued on Jul. 20, 1999, which is commonly assigned and incorporated herein by reference, a technique is introduced which departs from the conventional Radon space driven conversion processing techniques for image reconstruction (such as those of U.S. Pat. Nos. 5,257,183 and 5,463,666), and provides a mechanism to incorporate the technique of data combination for region-of-interest (ROI) reconstruction with the Kudo et al. image reconstruction processing, thereby providing an image reconstruction technique for a cone beam imaging system that can not only have a spiral scan path, but can also use a short detector. With this technique, instead of division by the function M(r, θ) as done by Kudo et al., the effect of the normalization of the reconstructed object densities is achieved by dividing the x-ray beam coverage of integration plane Q(r, θ) between the various source positions that illuminate the plane without any overlap.
More specifically, this technique comprises a 4 step process:
1. Apply a mask to the set of cone beam projection data acquired by the detector at each of the source positions, so that only specific non-overlapping contributions to the Radon data can be developed from the projection data.
2. Calculate line integral derivatives in the masked data.
3. Perform a 2D backprojection of the derivative data onto an extended height virtual detector.
4. Perform a 3D backprojection of the 2D data from the virtual detector into a 3D object space.
The presence of a detector mask ensures that the contributions developed by processing the projection data of the different detectors are unique and non-redundant (FIG. 1 of this disclosure). Accordingly, division by the function M(r, θ), or its equivalent, is no longer needed, which is a significant simplification in the image reconstruction signal processing. However, although step 2 is not complex, it is computationally expensive. More specifically, it comprises calculating a plurality of line integrals L(r, θ) on each set of the masked detector data, to generate sampled 1D projections of the detector data. Line integral derivatives are then computed from the 1D projections by taking the difference between parallel line segments L1 and L2, as shown in mask 20 of FIG. 2 herein. Note that the L1 and L2 line segments are not limited by the boundaries of the mask 20, and therefore their use results in an exact calculation of the derivatives of the line integrals L(r, θ). This type of masking is referred to herein as “soft masking”. Additional details of such soft masking can be found in U.S. Pat. No. 5,748,697, incorporated herein by reference. Step 3 backprojects the line integral derivatives onto the extended “virtual” detector. Before the 3D backprojection in step 4, the gradient of the backprojected virtual detector data in the direction of the scan path is calculated, and the result is then backprojected into the 3D object space for reconstructing the ROI of the object. For good image quality, the sampling of the projections and the number of source positions need to be very fine. Thus, the filter process described in U.S. Pat. No. 5,926,521 is computationally costly.
In the above-incorporated U.S. Pat. No. 5,881,123 entitled SIMPLIFIED CONE BEAM IMAGE RECONSTRUCTION USING 3D BACKPROJECTION, a Feldkamp convolution processing simplification (also referred to as ramp filtering) is implemented with the above-described image reconstruction processing, wherein the entire filter process described in U.S. Pat. No. 5,926,521 is replaced with a single step of ramp filtering of the detector data in the direction of the scan path. This simplification is illustrated in FIG. 3, where L, L1′ and L2′ are three closely spaced parallel line segments that are bound by a mask 30, and L is midway between L1′ and L2′. Line segment L is representative of many such line segments formed at various angles in mask 30, and corresponds to the previously described lines L(r, θ) of FIG. 1, which, as is well known to those skilled in this technology, are used for computing Radon derivative data from the cone beam projection data. In the technique described in U.S. Pat. No. 5,881,123, due to the bounding of the line segments L1′ and L2′ by mask 30, the Feldkamp convolution processing simplification (referred to as ramp filtering) is performed as a substitute for the line integral derivative calculations, which filtering corresponds to calculation of the Radon derivative of the partial plane defined by the line segment L and the current source position, up to a multiplicative constant.
Although this operation is computationally very fast, it yields only an approximation of the Radon derivative of the partial plane, due to errors that arise from the “hard masking” of the endpoints of line segments L1′ and L2′ by mask 30, as compared to the “soft” masking shown in FIG. 2. That is, hard masking incorrectly limits the detector pixel values to those pixels that are in the mask area, and zeros out the detector pixel values that are outside of the mask boundaries, instead of correctly limiting only the line segments L to the mask area (and calculating the line integral derivatives using the unmasked original detector data where appropriate, i.e., near the mask boundaries).
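The bias introduced by hard masking can be seen in a one-dimensional toy example (hypothetical data; a central difference stands in for the line integral derivative): soft masking evaluates the difference using the original, unmasked neighbouring samples, while hard masking zeroes samples outside the mask and therefore errs at the mask boundary.

```python
import numpy as np

data = np.arange(10.0)                        # hypothetical detector row
mask = (np.arange(10) >= 2) & (np.arange(10) <= 7)   # mask covers indices 2..7

def soft_derivative(i):
    """Central difference using the ORIGINAL data on both sides, even
    when a neighbour lies outside the mask (soft masking)."""
    return (data[i + 1] - data[i - 1]) / 2.0

def hard_derivative(i):
    """Same difference, but computed on data zeroed outside the mask
    (hard masking)."""
    hd = np.where(mask, data, 0.0)
    return (hd[i + 1] - hd[i - 1]) / 2.0

# interior point: both agree; boundary point: hard masking is biased
interior = (soft_derivative(4), hard_derivative(4))
boundary = (soft_derivative(2), hard_derivative(2))
```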
Accordingly, in U.S. Pat. No. 6,018,561, entitled MASK BOUNDARY CORRECTION IN A CONE BEAM IMAGING SYSTEM USING SIMPLIFIED FILTERED BACKPROJECTION IMAGE RECONSTRUCTION, which is incorporated herein by reference, the present inventor describes a technique for computing 2D correction data which, when combined with the ramp filtered 2D approximation data sets, yields an exact image reconstruction. As described in greater detail in U.S. Pat. No. 6,018,561, the mathematical difference between hard and soft masking, which involves only detector data around the mask boundaries, is calculated to arrive at an additive correction term.
More specifically, in U.S. Pat. No. 6,018,561, it was determined that when the spiral scan path extends beyond both ends of the object, exact image reconstruction can be achieved with the following operations:
Step 1: Mask the cone beam data. This step comprises applying a spiral mask (such as shown in FIG. 5) to each set of the projection data so that data inside the boundaries of each mask form a corresponding plurality of masked 2D data sets.
Step 2: 1D Ramp filter cone beam data in interior of mask in the direction of scan path. This step comprises ramp filtering each masked 2D set along a plurality of parallel lines formed therein, to generate a corresponding plurality of filtered 2D data, each filtered 2D data set corresponding to a calculation of a first estimate of Radon derivative data determined from a given set of the 2D cone beam projection data.
Step 3: 2D filter cone beam data on mask boundary. This step comprises generating 2D correction data for each of the first estimates of Radon derivative data by processing portions of the given set of cone beam projection data that are adjacent boundaries of the mask. The result is referred to as the boundary/correction term.
Step 4: Image reconstruction. This step comprises performing a weighted 3D backprojection of the filtered cone beam data from each pixel on the detector onto the 3D object volume. In particular, this step comprises combining each filtered 2D data set and the 2D correction data calculated therefor, via a weighted 3D backprojection protocol into a 3D space, thereby reconstructing a 3D image of the ROI in the object. Each of these steps is described in further detail below and in the above-incorporated U.S. Pat. No. 6,018,561.
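A minimal sketch of the ramp filtering of Step 2, assuming a simplified geometry in which the parallel filtering lines coincide with the rows of the masked 2D data set, might look as follows (the |f| frequency response is the standard ramp filter; all sizes are hypothetical):

```python
import numpy as np

def ramp_filter_rows(masked, dx=1.0):
    """Step 2 sketch: 1D ramp (|f|) filter applied along each row of a
    masked 2D data set, the rows standing in for lines parallel to the
    projection of the scan path onto the detector."""
    n = masked.shape[1]
    freqs = np.fft.fftfreq(n, d=dx)
    spectrum = np.fft.fft(masked, axis=1) * np.abs(freqs)
    return np.real(np.fft.ifft(spectrum, axis=1))

# toy usage: a constant row has no frequency content above DC, and the
# ramp filter zeroes the DC component, so the output vanishes
flat = np.ones((4, 16))
out = ramp_filter_rows(flat)
```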
The above 4-step algorithm (referred to herein as the “short object algorithm”) provides accurate image reconstruction when the entire object is the ROI (i.e., the spiral scan extends past the ends of the object). There are various objects of interest in medical as well as industrial inspections, however, that are very long, relatively speaking. For example, a patient's body is a long object. And, in many instances, only a relatively small sectional region of the long object is of interest. Even if an image of the entire object is needed, it can be obtained by stacking up such sectional regions. It is therefore more practical to employ a spiral scan path just big enough to cover the sectional region rather than to cover the entire object. Under such circumstances, the “short object algorithm” described above may not yield an accurate reconstruction of the sectional region, because the cone beam data of the overlaying objects “contaminate” the reconstructed ROI. In particular, contamination results from steps 2 and 3 of the short object algorithm, which spread the cone beam data of the overlaying objects into the reconstructed ROI.
FIG. 6 is a schematic diagram that illustrates the principle of data contamination due to overlaying objects. As shown, a ray R, which is emitted from the source to the detector, traverses a portion of the ROI as well as a portion (denoted D) of the long object outside the ROI. To achieve accurate image reconstruction of the ROI, only the data component of ray R associated with the ROI is desired. Contamination is caused by the data component of ray R associated with the portion D outside the ROI. Therefore, in FIG. 6, the total contamination contained in the captured image data of the ROI is due to data associated with objects that are located above and below the ROI.
In a spiral scan environment, overlaying objects are “seen” by the source positions in the vicinity of both ends of the spiral scan path. When the short object algorithm is used for image reconstruction of a ROI within a long object, the filtering in step 2 has finite spatial support, and thus the contamination it causes affects only a finite volume near both ends of the ROI. However, the spatial support of the filtering in step 3 of the short object algorithm is infinite, and thus the contamination it causes affects the entire ROI. It is therefore preferable to modify step 3 of the short object algorithm in a manner that achieves more accurate ROI reconstruction in a spiral scan of a long object.
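The difference in spatial support can be checked numerically: the real-space kernel of a ramp filter has tails extending over its whole support, while a central-difference d/dr kernel is nonzero only next to the origin. The kernel length below is an arbitrary illustrative choice.

```python
import numpy as np

n = 256
# real-space impulse response of the |f| ramp filter
ramp_kernel = np.real(np.fft.ifft(np.abs(np.fft.fftfreq(n))))

# central-difference d/dr kernel: nonzero only at offsets +1 and -1
deriv_kernel = np.zeros(n)
deriv_kernel[[1, -1]] = [0.5, -0.5]

# the derivative kernel vanishes away from the origin...
far_deriv = np.max(np.abs(deriv_kernel[10:-10]))
# ...while the ramp kernel does not: contamination entering a ramp
# filtered line spreads along the entire line
far_ramp = np.max(np.abs(ramp_kernel[10:-10]))
```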
The present invention is directed to a system and method that accurately reconstructs an image of a ROI (region of interest) within a long object by removing components associated with data contamination.
In one aspect of the present invention, an image reconstruction method comprises the steps of:
collecting a set of image data along a spiral scan path of a ROI (region of interest) portion of an object;
identifying contaminated data within the collected set of image data, the contaminated data corresponding to image data associated with an object outside the ROI;
reconstructing an image of the ROI using the set of image data less the contaminated data.
In another aspect, the step of identifying comprises identifying a source position near a beginning or end portion of the spiral scan path.
In yet another aspect, the step of identifying comprises identifying a first complete sinusoidal stage near a beginning portion of the ROI and a second complete sinusoidal stage near an ending portion of the ROI, wherein a complete sinusoidal stage comprises source positions scanning an angular range of π.
In another aspect of the present invention, the step of identifying the first and second complete sinusoidal stages comprises projecting the spiral path on a plane in Radon space.
In yet another aspect, the step of identifying contaminated data comprises identifying image data associated with a source position that lies on an incomplete sinusoidal stage.
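One hypothetical way to sketch this stage bookkeeping (an illustrative construction, not the patent's actual procedure) is to partition the spiral's angular parameter into stages of length π and flag views lying on incomplete end stages as contaminated:

```python
import numpy as np

def classify_stages(t_start, t_end, n_views):
    """Partition source angular positions t into stages of length pi.
    A stage is 'complete' only if the scan covers its full pi range;
    views on the incomplete stages at either end of the spiral are
    flagged as sources of contaminated data."""
    t = np.linspace(t_start, t_end, n_views, endpoint=False)
    stage = np.floor(t / np.pi).astype(int)
    complete = {s for s in np.unique(stage)
                if s * np.pi >= t_start and (s + 1) * np.pi <= t_end}
    return t, np.array([s in complete for s in stage])

# a scan from 0.5*pi to 3.2*pi fully covers only the stages
# [pi, 2*pi) and [2*pi, 3*pi); views outside them are flagged
t, ok = classify_stages(0.5 * np.pi, 3.2 * np.pi, 1000)
```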
In another aspect of the present invention, a scanning and data acquisition method for three dimensional (3D) computerized tomographic (CT) imaging of a region-of-interest (ROI) in an object, wherein image reconstruction processing is applied to a plurality of sets of 2D cone beam projection data, each set being acquired by irradiation of the object by energy from a cone beam source that is directed to a 2D detector at a corresponding plurality of scan path source positions, comprises the steps of:
applying a mask to each set of the projection data so that data inside the boundaries of each mask form a corresponding plurality of masked 2D data sets;
ramp filtering each masked 2D data set along a plurality of parallel lines formed therein, to generate a corresponding plurality of filtered 2D data, each filtered 2D data set corresponding to a calculation of a first estimate of Radon derivative data determined from a given set of the 2D cone beam projection data;
generating 2D correction data for each of the first estimates of Radon derivative data by processing portions of the given set of cone beam projection data that are adjacent boundaries of the mask, wherein the step of generating 2D correction data is not performed for cone beam projection data that is associated with objects outside a region of interest; and
combining each filtered 2D data set and the 2D correction data calculated therefor, in a weighted 3D backprojection manner into a 3D space, thereby reconstructing a 3D image of the ROI in the object.
These and other objects, features and advantages of the present invention will be described or become apparent from the following detailed description of preferred embodiments, which is to be read in connection with the accompanying drawings.