Random coincidences are a significant source of noise in, for example, PET. The raw data collected by a PET scanner may be a list of coincidence events representing near-simultaneous detection of annihilation photons by a pair of detectors. Each coincidence event represents a line in space connecting the two detectors, along which the positron annihilation occurred. This line may be referred to as the line of response (LOR).
Coincidence events can be sorted into groups of LORs, called sinograms, that represent projection views through the radionuclide distribution within the object being scanned. The sinograms may be sorted by the angle of each view around the axis of the scanner and, in the case of 3D acquisitions, by its tilt with respect to this axis. A normal PET data set has millions of counts for the whole acquisition, which may include a large component of undesirable scatter and random events. Considerable pre-processing of the data may be required, for example, correction for random coincidences, estimation and subtraction of scattered photons, detector dead-time correction (after detecting a photon, a detector needs a recovery period before it can register another event) and detector-sensitivity correction (for both inherent detector sensitivity and changes in sensitivity due to the angle of incidence).
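The sorting of list-mode coincidence events into sinogram bins may be illustrated with a short sketch. The function below is a hypothetical, simplified 2D example (the function name, bin counts and field-of-view radius are illustrative assumptions, not part of any particular scanner's software): each LOR is parameterised by its in-plane angle and its signed radial offset from the scanner axis, and the events are then histogrammed over those two coordinates.

```python
import numpy as np

def bin_to_sinogram(x1, y1, x2, y2, n_angles=180, n_radial=128, fov_radius=1.0):
    """Bin LOR endpoint coordinates (arrays of equal length) into a 2D
    sinogram of shape (n_angles, n_radial). Purely illustrative geometry."""
    dx, dy = x2 - x1, y2 - y1
    # In-plane angle of each LOR, folded into [0, pi) since a line has no direction
    phi = np.mod(np.arctan2(dy, dx), np.pi)
    # Signed perpendicular distance of the LOR from the scanner axis;
    # any point on the line gives the same value, so the first endpoint is used
    s = -x1 * np.sin(phi) + y1 * np.cos(phi)
    angle_bins = np.linspace(0.0, np.pi, n_angles + 1)
    radial_bins = np.linspace(-fov_radius, fov_radius, n_radial + 1)
    # One count per coincidence event; LORs outside the radial FOV are dropped
    sino, _, _ = np.histogram2d(phi, s, bins=[angle_bins, radial_bins])
    return sino
```

A real 3D acquisition would additionally bin each LOR by its tilt with respect to the scanner axis, as noted above.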
Consequently, there is considerable interest in developing methods and algorithms that reduce the data variance due to, for example, random coincidence correction. To estimate the true events in a PET scanner, the random coincidences, measured using a delayed, or offset, coincidence window, may be subtracted from the measured prompt coincidence events. However, since the random coincidences are statistically uncorrelated with the prompt coincidences, this subtraction adds a random noise variance to the corrected measurement. The amount of noise added during this correction is proportional to the noise in the random coincidence estimate used. This subtraction therefore makes methods that reduce the noise in the estimated random coincidences attractive.
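The variance added by the delayed-window subtraction can be made concrete with a small Monte-Carlo sketch (the rates and trial count below are illustrative assumptions). Because the prompt and delayed counts are independent Poisson variables, the variance of their difference is the sum of their variances:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-LOR mean counts for one acquisition
true_mean, randoms_mean = 50.0, 20.0
n_trials = 200_000

# The prompt window records trues plus randoms; the delayed window records
# an independent Poisson realisation of the same randoms rate.
prompts = rng.poisson(true_mean + randoms_mean, n_trials)
delayeds = rng.poisson(randoms_mean, n_trials)

# Direct subtraction is unbiased for the trues...
trues_est = prompts - delayeds

# ...but its variance is the sum of the two Poisson variances,
# (trues + randoms) + randoms, i.e. larger than the prompts' variance alone.
print(trues_est.mean())  # ~ 50
print(trues_est.var())   # ~ 90 (= 70 + 20)
```

This is why a lower-noise estimate of the randoms (for example, a smoothed delayed-window sinogram) directly lowers the variance of the corrected data.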
In view of the discussion above, there may be a need to provide a method and a system allowing for reduction of randoms variance. Further, these systems and methods should preferably be easy to implement and/or fast to compute.
Improved image quality is always sought in tomography. Consequently, a system and method reducing randoms variance may improve the final image quality in tomography.
Additionally, it may be desirable to provide a system and method that allow for a more accurate and precise reduction of randoms variance. A more accurate and precise reduction of randoms variance may be desirable from an economic and/or technical perspective.