Computed tomography techniques, including X-ray computed tomography (CT), single photon emission computed tomography (SPECT), and positron emission tomography (PET), are well-established imaging modalities. Conventional image reconstruction in these and other imaging modalities has been performed using a method known as filtered backprojection (FBP). More recently, iterative image reconstruction methods have been introduced, with the main motivation being x-ray dose reduction. One goal of the iterative techniques is to lower the dose of radiation experienced by a subject being imaged. The lower doses and undersampling (when used, e.g., to reduce dose) make it challenging to compute high-contrast, clear images. Another goal of iterative techniques is to provide high-quality reconstruction from data acquired by advanced detector systems, such as photon counting detectors, for which FBP may not provide the desired quality. Recent iterative techniques utilize a data fidelity function to encourage fidelity to the measured data, together with an edge-preserving regularization function, such as total variation or qGGMRF, that encourages smoothness in image areas with little variation while recovering sharp boundaries to preserve image resolution.
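The cost function described above can be illustrated with a minimal sketch. The following Python example is hypothetical: the operator A, the weight matrix W, and the smoothed one-dimensional total-variation penalty are illustrative stand-ins, not any particular reconstruction implementation, and plain gradient descent stands in for the more elaborate update schemes used in practice.

```python
import numpy as np

def iterative_recon(A, b, W, beta=0.1, step=1e-3, n_iter=200, eps=1e-3):
    """Minimize 0.5*(A x - b)^T W (A x - b) + beta * TV_eps(x) by gradient
    descent, where TV_eps(x) = sum_i sqrt((x[i+1] - x[i])**2 + eps**2) is a
    smoothed 1-D total-variation (edge-preserving) penalty."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        # Gradient of the data fidelity term: A^T W (A x - b).
        grad = A.T @ (W @ (A @ x - b))
        # Gradient of the smoothed TV penalty via finite differences.
        d = np.diff(x)
        g = d / np.sqrt(d**2 + eps**2)
        grad[:-1] -= beta * g
        grad[1:] += beta * g
        x -= step * grad
    return x
```

For a piecewise-constant object, the TV term encourages smoothness in flat regions while the near-unit gradient at large differences lets sharp boundaries survive, which is the edge-preserving behavior described above.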
DeMan et al., U.S. Pat. No. 8,971,599, discloses methods for iterative tomographic reconstruction. Three embodiments are discussed that use a ramp filter as a preconditioner. In a first embodiment, the iteration applies a deconvolution filter and re-evaluates to see if completion criteria have been satisfied. This first embodiment uses a change of variables (that is, replacing all instances of the image variable x with another variable z, using the equivalence x=Fz for a filter F). The approach of the first embodiment is highly similar to the classical formulation of preconditioning. In a second embodiment, a preconditioner is applied to the simultaneous image update steps of an iterative reconstruction, where the preconditioner approximates the inverse Hessian of the cost function being optimized. In a variation, the preconditioner is applied only to the data fidelity term (called the data fit term in the '599 patent) and not to the regularization term. This strategy is intended to accommodate edge-preserving priors, which are commonly used in current iterative reconstruction methods. However, by applying the preconditioner in this way, it ceases to be a proper preconditioner. Instead, this approach fundamentally alters the minimization process such that the fixed point of the iteration no longer corresponds to the fixed point of the original update sequence. One drawback of the techniques in the '599 patent arises in the common case that statistical weighting matrices are used in the update steps. These weighting matrices degrade the approximation of the ramp filter (preconditioner) to an inverse Hessian operation, and therefore degrade the preconditioning effect.
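The distinction drawn above between a proper preconditioner and one applied only to the data fidelity term can be sketched numerically. The following Python example is illustrative: A, b, and the quadratic regularizer (a stand-in for an edge-preserving prior, chosen so the true minimizer has a closed form) are hypothetical. Preconditioning the full gradient leaves the minimizer unchanged, while preconditioning only the data term moves the fixed point of the iteration.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(30, 10))      # stand-in for a system matrix
b = rng.normal(size=30)            # stand-in for measured data
alpha = 0.5                        # quadratic regularizer weight

H_data = A.T @ A                   # Hessian of the data fidelity term
M = np.linalg.inv(H_data)          # "ideal" preconditioner for the data term

def grad_data(x):
    return A.T @ (A @ x - b)

def grad_reg(x):
    return alpha * x

def step_proper(x, t=0.5):
    # Proper preconditioning: M multiplies the FULL gradient, so the fixed
    # point still satisfies grad_data(x) + grad_reg(x) = 0, the true minimizer.
    return x - t * M @ (grad_data(x) + grad_reg(x))

def step_data_only(x, t=0.5):
    # Preconditioning only the data term: the fixed point now satisfies
    # M grad_data(x) + grad_reg(x) = 0, which is a different point.
    return x - t * (M @ grad_data(x) + grad_reg(x))

# True minimizer of the regularized cost, from the normal equations.
x_true = np.linalg.solve(H_data + alpha * np.eye(10), A.T @ b)

x1 = np.zeros(10)
x2 = np.zeros(10)
for _ in range(2000):
    x1 = step_proper(x1)
    x2 = step_data_only(x2)
```

After convergence, x1 coincides with the minimizer of the original cost, while x2 has settled at a different fixed point, which is the alteration of the minimization process noted above.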
In particular, the presence of the weighting matrix W, which often has a large dynamic range and therefore causes the Hessian of the data fidelity term in the cost function to be poorly conditioned, has a detrimental effect on the rate of convergence of the iterative algorithm, ultimately requiring more iterations to achieve a desired accuracy.
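The conditioning effect of W can be illustrated with a small numerical sketch. The matrix A and the weights below are hypothetical; in transmission CT the statistical weights roughly track detected photon counts and can span several orders of magnitude across rays.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(60, 15))        # stand-in for a system matrix

w_flat = np.ones(60)                 # uniform weighting
w_stat = np.logspace(-6, 0, 60)      # statistical weights spanning six decades

# Hessian of the data fidelity term, A^T W A, with W = diag(w).
H_flat = A.T @ (w_flat[:, None] * A)
H_stat = A.T @ (w_stat[:, None] * A)

cond_flat = np.linalg.cond(H_flat)
cond_stat = np.linalg.cond(H_stat)
# Gradient-type iterations need on the order of cond(H) steps to reach a
# fixed accuracy, so the larger condition number under statistical
# weighting translates directly into more iterations.
```

The large dynamic range of w_stat inflates the condition number of A^T W A relative to the unweighted Hessian, which is the degradation of the preconditioning effect and the slower convergence described above.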