This invention relates generally to Computed Tomography (CT) imaging systems, and more particularly to projection interpolation algorithms and image prediction algorithms for low dose perfusion technology.
CT perfusion is a recent advancement in clinical applications. In such a scan, the organ of interest is repeatedly scanned in a cine mode (the table remains stationary during the scan) while a contrast medium is injected into the patient and propagates through the blood circulation. By monitoring the contrast uptake of the different parts of the organ, the mean transit time (MTT), cerebral blood flow (CBF), cerebral blood volume (CBV), and other parameters can be calculated. These parameters are used to differentiate viable from non-viable tissue. In the current protocol, the patient is continuously scanned at one-second intervals for 30 to 40 seconds.
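The parameter calculation described above can be sketched as follows. This is an illustrative example only, not the claimed method: the gamma-variate curves, timing constants, and scale factors are hypothetical stand-ins for measured time-attenuation data, CBV is approximated by the ratio of areas under the tissue and arterial curves, MTT by the first moment of the tissue curve, and CBF by the central volume relation CBF = CBV / MTT.

```python
import numpy as np

# Hypothetical one-second sampling over a 40-second cine acquisition,
# matching the protocol described in the text.
t = np.arange(0.0, 40.0, 1.0)

def gamma_variate(t, t0, alpha, beta, scale):
    """Synthetic gamma-variate contrast bolus (illustrative values only)."""
    c = np.zeros_like(t)
    m = t > t0
    c[m] = scale * ((t[m] - t0) ** alpha) * np.exp(-(t[m] - t0) / beta)
    return c

# Assumed arterial input function (AIF) and tissue time-attenuation curve.
aif = gamma_variate(t, 5.0, 3.0, 1.5, 1.0)
tissue = 0.2 * gamma_variate(t, 7.0, 3.0, 2.5, 1.0)

# CBV approximated by the ratio of areas under the curves
# (uniform 1-s sampling, so the time step cancels).
cbv = tissue.sum() / aif.sum()

# MTT approximated by the first moment (center of mass) of the tissue curve.
mtt = (t * tissue).sum() / tissue.sum()

# Central volume theorem: CBF = CBV / MTT.
cbf = cbv / mtt
```

In practice these quantities are computed per voxel (often via deconvolution with the AIF rather than simple area ratios), producing the parameter maps used to differentiate viable from non-viable tissue.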
It is desirable to reduce the radiation dose to the patient. One known way is to directly reduce the x-ray tube current (milliamperage). Because a low milliamperage results in higher noise in the projection data, the reconstruction yields a sequence of noisy images. At least some basic algorithms exist to reduce noise in either the projection or the image space. However, once noise is introduced into the collected projection data, it is difficult to remove it completely.
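The trade-off described above can be illustrated with a minimal sketch. The profile, reference tube current, and noise model (quantum noise standard deviation scaling as the inverse square root of the milliamperage) are assumptions for illustration; the moving-average filter stands in for a generic projection-space denoiser and is not the claimed algorithm. The point is that smoothing reduces the residual noise but does not eliminate it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Idealized noiseless projection profile (illustrative sinusoid).
n = 512
clean = 100.0 + 20.0 * np.sin(np.linspace(0, 4 * np.pi, n))

def noisy_projection(clean, ma, ref_ma=200.0, ref_sigma=1.0):
    """Add quantum noise that grows as tube current drops: std ~ 1/sqrt(mA)."""
    sigma = ref_sigma * np.sqrt(ref_ma / ma)
    return clean + rng.normal(0.0, sigma, clean.shape)

# Low-dose acquisition: one quarter of the reference tube current.
low_dose = noisy_projection(clean, ma=50.0)

# Simple moving-average filter applied in projection space.
kernel = np.ones(5) / 5.0
smoothed = np.convolve(low_dose, kernel, mode="same")

# Compare residual noise on the interior (edges are biased by zero padding).
interior = slice(2, -2)
rms_before = np.sqrt(np.mean((low_dose - clean)[interior] ** 2))
rms_after = np.sqrt(np.mean((smoothed - clean)[interior] ** 2))
# rms_after is smaller than rms_before, but remains nonzero:
# the noise is attenuated, not removed.
```

The residual error after filtering never reaches zero, which is the motivation stated above for approaches that avoid injecting the noise in the first place rather than removing it afterward.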