The present invention relates generally to the field of medical diagnostic systems, such as imaging systems. More particularly, the invention relates to a technique for storing collected imaging data to a stand alone memory device during data acquisition so as to reduce the amount of memory required by a processor to acquire an entire set of imaging data.
Positrons are positively charged electrons which are emitted by radionuclides which have been prepared using a cyclotron or other device. The radionuclides most often employed in diagnostic imaging are fluorine-18 (18F), carbon-11 (11C), nitrogen-13 (13N), and oxygen-15 (15O). Radionuclides are employed as radioactive tracers called "radiopharmaceuticals" by incorporating them into substances such as glucose or carbon dioxide. One common use for radiopharmaceuticals is in the medical imaging field.
To use a radiopharmaceutical in imaging, the radiopharmaceutical is injected into a patient and accumulates in an organ, vessel or the like, which is to be imaged. It is known that specific radiopharmaceuticals become concentrated within certain organs or, in the case of a vessel, that specific radiopharmaceuticals will not be absorbed by a vessel wall. The process of concentrating often involves processes such as glucose metabolism, fatty acid metabolism and protein synthesis. Hereinafter, in the interest of simplifying this explanation, an organ to be imaged will be referred to generally as an "organ of interest" and the prior art and the invention will be described with respect to a hypothetical organ of interest.
After the radiopharmaceutical becomes concentrated within an organ of interest and while the radionuclides decay, the radionuclides emit positrons. The positrons travel a very short distance before they encounter an electron and, when the positron encounters an electron, the positron is annihilated and converted into two photons, or gamma rays. This annihilation event is characterized by two features which are pertinent to medical imaging and particularly to medical imaging using positron emission tomography (PET). First, each gamma ray has an energy of essentially 511 keV upon annihilation. Second, the two gamma rays are directed in substantially opposite directions.
In PET imaging, if the general locations of annihilations can be identified in three dimensions, the shape of an organ of interest can be reconstructed for observation. To detect annihilation locations, a PET camera is employed. An exemplary PET camera includes a plurality of detectors and a processor which, among other things, includes coincidence detection circuitry. Each time a 511 keV photon impacts a detector, the detector generates an electronic signal or pulse which is provided to the processor coincidence circuitry.
The coincidence circuitry identifies essentially simultaneous pulse pairs which correspond to detectors which are generally on opposite sides of the imaging area. Thus, a simultaneous pulse pair indicates that an annihilation has occurred on a straight line between an associated pair of detectors. Over an acquisition period of a few minutes, millions of annihilations are recorded, each annihilation associated with a particular detector pair. After an acquisition period, the recorded annihilation data is used, via any of several well known procedures, to construct a three dimensional image of the organ of interest.
PET cameras have been configured in many geometries. Because annihilation data has to be collected from essentially 360 degrees about an organ which is to be imaged, one popular PET camera configuration includes small detectors arranged to form an annular gantry about the imaging area. In this case data from all required degrees can be collected at the same time, separated into data from different angles about the imaging area and then back projected as different profile type views to form the tomographic image. Unfortunately, annular cameras require large numbers of detectors and therefore are extremely expensive, which renders annular cameras unsuitable for many applications.
Referring to FIG. 1, another common PET camera configuration 10 includes first and second cameras 12, 14, respectively, each camera 12, 14 including an impact surface 13, 15, respectively, for detecting impacting gamma rays. Each camera 12 and 14 is characterized by a width W across which hardware which can distinguish M different impact locations is arranged. To detect coincident gamma ray pairs, first and second cameras 12 and 14 are positioned a distance D apart and such that surfaces 13 and 15 oppose each other on opposite sides of an imaging area 16 and define a field of view (FOV). With the opposing camera configuration, instead of collecting tomographic data from all angles about imaging area 16 simultaneously as with an annular configuration, during an acquisition session first and second cameras 12 and 14 are rotated (see arrows 18, 20) about imaging area 16 through approximately 180 degrees, the cameras being maintained at different stop angles for short acquisition periods which together comprise the acquisition session.
For the purposes of this explanation the term "profile view" or simply "view" will be used to describe all annihilation data collected during a data acquisition period which emanates from the imaging area along parallel paths. At each camera position, cameras 12 and 14 collect annihilation data corresponding to several different profile views. A more detailed analysis of FIG. 1 can be used to better understand profile views and how data corresponding to several views is collected at each camera position.
Referring to FIG. 1, an initial camera position angle θo is defined by a line between and perpendicular to impact surfaces 13 and 15. During rotation, a stop angle θs is defined by the angle between the initial position angle θo and the instantaneous line between and perpendicular to impact surfaces 13 and 15. While some systems operate with a continuously changing stop angle θs during data acquisition, unless indicated otherwise and in the interest of simplifying this explanation, it will be assumed that an exemplary system actually stops at different stop angles and only acquires data while stationary.
Referring still to FIG. 1, assuming cameras 12 and 14 are in the initial position illustrated so that stop angle θs is zero degrees, if an annihilation event occurs at the center of imaging area 16 as indicated by point 22, the annihilation event may generate a corresponding gamma ray pair which emanates along virtually any path. However, with cameras 12 and 14 positioned as illustrated, cameras 12 and 14 can only collect generated gamma rays if the rays are directed within an angle range between a maximum negative flight path angle −θm and a maximum positive flight path angle +θm (and within a z-axis plane which is perpendicular to the illustration). In the interest of simplifying this explanation it will be assumed that cameras 12 and 14 are single dimensional (i.e. z=1) and, although range −θm through +θm may span several different ranges, it will be assumed that range −θm through +θm spans 30° (i.e. 15° on either side of an instantaneous stop angle θs).
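The acceptance range just described can be expressed as a short illustrative sketch; the function name and the 15° half-width value are assumptions chosen to match the simplified example above, not part of any actual system:

```python
# Illustrative sketch only: the range of projection ray angles collectable
# while the cameras are held at a given stop angle, assuming the 30-degree
# (i.e. +/-15 degree) acceptance range described above.
THETA_M = 15.0  # assumed half-width of the acceptance range, in degrees

def flight_angle_range(stop_angle_deg):
    """Return the (minimum, maximum) projection ray angles collectable
    at stop angle stop_angle_deg."""
    return (stop_angle_deg - THETA_M, stop_angle_deg + THETA_M)
```

Under these assumptions, `flight_angle_range(0.0)` yields (−15.0, 15.0), corresponding to the initial stop angle discussed above.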
Referring also to FIG. 2, assume the annihilation event at point 22 (i.e. the center of imaging area 16) which is being studied generates gamma rays which are directed along a flight path 50 which is parallel to initial position angle θo. In addition, assume that other annihilation events occur at other positions indicated at points 24, 26, 28, 30 and 32 and that each of those events, like the event at point 22, generates a pair of gamma rays which emanate along flight paths parallel to path 50. Because all of the ray pairs in FIG. 2 are parallel, the pairs together form a profile view of data, in essence indicating what the collected data "appears" like from a view which is perpendicular to the gamma ray flight paths. For the purposes of this explanation the profile view corresponding to the flight path illustrated in FIG. 2 will be referred to as a first profile view.
While some gamma rays are traveling along the paths indicated in FIG. 2 during a data acquisition period, other gamma rays travel along other flight paths. For example, referring also to FIG. 3, annihilation events are indicated at points 34, 36, 38 and 40, each of which causes gamma rays having flight paths which cause rays to impact surfaces 13 and 15 at a projection ray angle θf. Because all of the ray pairs in FIG. 3 are parallel, the pairs together form a profile view of data, in essence indicating what the collected data "appears" like from a view which is perpendicular to the gamma ray flight paths. For the purposes of this explanation the profile view corresponding to the flight path illustrated in FIG. 3 will be referred to as a second profile view.
Thus, annihilation data corresponding to two different profile views is collected simultaneously while cameras 12 and 14 are at the initial position illustrated in FIGS. 2 and 3. In fact, for every angle within range −θm to +θm, data is collected for a separate profile view corresponding to the angle, the number of angles limited only by the ability of camera 12 and 14 hardware to distinguish between angles. For example, when collecting data at the initial position illustrated, data corresponding to one thousand different profile views might be collected.
To distinguish between data from different profile views, collected annihilation data is stored as a function of two coordinates, a first coordinate, a projection ray angle θf, indicating the projection ray path associated with a specific profile view and a second coordinate, a distance R, indicating the location of the projection ray path within the profile view. For example, referring still to FIG. 3, the annihilation event which occurs at point 38 is associated with projection ray angle θf1 and is a distance R1 from imaging area central point 22. Other events illustrated in FIG. 3 are characterized by the same projection ray angle θf1 but different distances R. Similarly, referring to FIG. 2, the annihilation event which occurs at point 30 is characterized by a projection ray angle coordinate θf2 (not illustrated) which is zero and a distance R2.
During data acquisition coincident counts are organized in a processor memory as a two-dimensional histogram array having as one of its dimensions projection ray angle θf and as the other dimension distance R, the counts associated with a single projection ray angle θf together comprising one profile view.
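A minimal sketch of such a coincident count organization follows; the names are hypothetical, and a hardware implementation would use a fixed two-dimensional array indexed by binned θf and R rather than a dictionary:

```python
from collections import defaultdict

# Hypothetical coincident count histogram keyed by (projection ray angle
# bin, distance R bin). Each key identifies one projection ray; its value
# is the number of annihilations detected along that ray so far.
histogram = defaultdict(int)

def record_coincidence(view_index, r_index):
    """Add one detected annihilation to the coincident count for the
    projection ray identified by (view_index, r_index)."""
    histogram[(view_index, r_index)] += 1
```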
While annihilation data is collected which corresponds to many different profile views at each stop angle, there is no stop angle at which all data corresponding to a single profile view is collected. For example, referring again to FIG. 2, when cameras 12 and 14 are positioned at the initial stop angle (i.e. θs=0°), the event at point 30 is detected and data associated therewith is stored in a corresponding coincident count associated with angle θf2 and distance R2. When cameras 12 and 14 are rotated to a different stop angle indicated in phantom and by numerals 12′ and 14′, an annihilation event at point 30 which generates gamma rays having the projection ray path illustrated still impacts both cameras 12′ and 14′ causing an annihilation detection which corresponds to the first profile view discussed above. This annihilation detection or event is added to the coincident count corresponding to angle θf2 and distance R2 as illustrated in FIG. 2.
Referring again to FIG. 1, after data is collected at the initial camera position illustrated, cameras 12 and 14 are rotated through a small angle in a clockwise direction to a second stop angle. For the purposes of this explanation it will be assumed that the stop angle increment between consecutive stop angles is 2 degrees. Thus, while the initial stop angle corresponds to a zero degree position, the second stop angle corresponds to a 2° position, the third stop angle corresponds to a 4° position and so on.
It should be apparent that, after cameras 12 and 14 are rotated to the second stop angle, range −θm to θm changes such that, upon commencing data acquisition at the new stop angle, no data is collected at original angle −θm. Similarly, each time the stop angle is changed by clockwise rotation to a new stop angle, angle −θm changes and data is not collected at the previous angle −θm during the next acquisition period. For example, where the range −θm through θm is 30°, the range −θm through θm is −15° through 15° during data acquisition at the initial stop angle. At the second stop angle where θs=2°, range −θm through θm is between −13° and 17° and so on. During acquisition at the second stop angle data is not collected which corresponds to initial angle −θm (i.e. −15°) while data is collected which corresponds to angle −θm=−13°. Hence, more data is collected which corresponds to angle −13° than corresponds to angle −15°.
It should be appreciated that only incomplete data corresponding to projection ray angles between −15° and +15° is acquired during the first half of an acquisition session. The dearth of data for initial projection ray angles θf between −15° and +15°, if not supplemented, reduces resulting image quality. To complete data between projection ray angles −15° and +15°, additional data is collected at the end of a data acquisition session when the stop angle exceeds 180−2θm degrees. In the present case, where range −θm through θm is 30°, data collection to complete projection ray coincident counts for projection rays having angles between −15° and 15° begins when the stop angle is 150°. When the stop angle exceeds 150°, annihilation events along original angle −θm (i.e. −15°) are again detected and corresponding coincident counts associated with projection ray angle −θm are increased. This process of supplementing the −15° to 15° projection ray coincident counts continues through a stop angle equal to 180−θin degrees, where θin is the stop angle increment. In the present case, because stop angle increment θin is 2°, data acquisition continues through 178°.
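The intermittent way in which a single profile view accumulates data across a session can be sketched as follows. This is a hypothetical helper for illustration only: the 15° acceptance half-width, the 0° to 178° stop angle range, and the treatment of the view at θf±180° as the same view all come from the simplified example above:

```python
def contributing_stops(theta_f, theta_m=15.0, stop_min=0.0, stop_max=178.0):
    """Return the stop-angle intervals (in degrees) during which the
    profile view at projection ray angle theta_f receives data, counting
    the symmetric view at theta_f +/- 180 degrees as the same view."""
    intervals = []
    for view in (theta_f - 180.0, theta_f, theta_f + 180.0):
        lo = max(stop_min, view - theta_m)
        hi = min(stop_max, view + theta_m)
        if lo <= hi:
            intervals.append((lo, hi))
    return intervals
```

Under these assumptions, `contributing_stops(-15.0)` yields `[(0.0, 0.0), (150.0, 178.0)]`: the −15° view receives data only at the initial stop angle and then again from the 150° stop angle through the end of the session, consistent with the discussion above.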
Thus, data corresponding to initial angle −θm and distance R is not completed until after essentially 180° of data acquisition. In conventional data acquisition systems the processor which collects coincident count data effectively maintains coincident count data for each possible projection ray (R, θf) during data acquisition and modifies counts for each ray (R, θf) as data is collected.
Over the course of 180° of data acquisition, in histogram space, the effect of the acquisition protocol described above is schematically illustrated in FIG. 4 which shows an exemplary histogram 70 having flight angle θf and stop angle θs along the vertical axis and distance R along the horizontal axis. During a data acquisition, cameras 12 and 14 rotate through stop angles ranging from 0 to 178°. At each stop angle θs, a diamond shaped region of coincident count data is acquired, each point within a diamond shaped region representing a separate coincident count coordinate (R, θf). For example, in FIG. 4 the diamond shaped region corresponding to data acquisition at the initial stop angle θs (as illustrated in FIG. 1) where θs is 0° is identified by numeral 44. As indicated above, at the initial stop angle θs coincident count data is collected which corresponds to gamma ray flight paths having projection ray angles between −θm and θm degrees.
For projection ray angles θf equal to angles −θm or θm, while cameras 12 and 14 are at the initial stop angle the only possible value for distance R is zero. Thus, as illustrated in FIG. 4, at angle −θm which is at the top of diamond shaped region 44, region 44 is restricted indicating that data corresponding to a single coincident count is stored for angle −θm at the first stop angle. Similarly, at angle θm which is at the bottom of diamond shaped region 44, region 44 is restricted indicating that data corresponding to a single coincident count is stored for angle θm at the initial stop angle.
Referring to FIGS. 1 and 2, while the cameras are at the initial stop angle θs=0°, for projection rays having angles θf equal to 0° so that associated rays are perpendicular to impact surfaces 13 and 15, data is collected for all distance R values between −W/2 and W/2. Thus, referring also to FIG. 4, at the initial stop angle θs, data corresponding to coincident counts associated with projection rays having angles θf equal to 0° is represented by a horizontal line which bisects region 44 and extends between distance R=−W/2 and distance R=W/2.
For every projection ray angle θf between initial angle −θm and 0° there are a number of distances R, the number of distances R depending on how close a specific flight angle θf is to 0°. The number of R coordinates for a specific projection ray angle θf increases as the projection ray angle θf gets closer to 0°. Similar comments can be made with respect to the histogram space between 0° and θm. Thus, region 44 is diamond shaped because there are a large number of R values at angle θf equal to 0°, a single R value at each of angles θf equal to −θm and θm, and a linearly decreasing number of R values between the 0° angle θf and angles −θm and θm.
Referring still to FIG. 4, at the second stop angle (i.e. a 2 degree stop angle in the present example), a second diamond shaped region 44′ of data is acquired. The shape of region 44′ is identical to the shape of region 44, the only difference being that region 44′ is shifted 2° along the vertical axis. At the second stop angle, while many of the projection ray angles for which data is collected are the same as during acquisition at the first stop angle, some of the projection ray angles are different. Specifically, the range of projection ray angle acquisition is (−θm+2°) through (θm+2°).
As stop angles are changed to acquire data from different perspectives about imaging area 16, the diamond shaped region is shifted down the vertical axis until the stop angle is equal to 178°. Because range −θm through θm is 30°, the range during data acquisition at the final stop angle is between 163° and 193° and the entire data set illustrated in FIG. 4 extends from −15° to 193° (i.e. approximately −θm through 180+θm degrees).
One problem with PET imaging systems is that the amount of data which must be acquired during data acquisition is extremely large and therefore a huge memory is required. One solution for reducing memory size is to generate data in a compact histogram form. To this end, referring again to FIG. 4, the symmetrical relationship P(R, θf)=P(−R, θf±180°) between polar coordinate data can be used to fold coincident counts corresponding to projection ray angles θf which are less than 0° or greater than 180° into projection ray coincident counts which are within the 0° to 180° range. This compacting is represented by arrows 60 and 62 and results in the compact histogram form 72 illustrated in FIG. 4. For example, referring again to FIG. 1, rays along projection ray paths having projection ray angles θf of −15° (i.e. the initial −θm angle) are outside the compact histogram form 72. This data can be directed to a histogram address within compact histogram form 72 by changing the sign of distance R and adding 180° to angle −θm. The new angle is 165° which is within compact form 72.
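The folding operation described above can be sketched directly from the symmetry relationship; this is an illustrative sketch working in raw degrees, with the binning of R and θf into histogram addresses omitted:

```python
def fold_to_compact(r, theta_f):
    """Map a coincident count coordinate (R, theta_f) into the compact
    0-180 degree histogram using P(R, theta) = P(-R, theta +/- 180)."""
    if theta_f < 0.0:
        return (-r, theta_f + 180.0)
    if theta_f >= 180.0:
        return (-r, theta_f - 180.0)
    return (r, theta_f)
```

Consistent with the example above, `fold_to_compact(r, -15.0)` changes the sign of R and maps the −15° projection ray angle to 165°, which lies within the compact form.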
Unfortunately the compact histogram solution has several shortcomings. First, while the mathematics to convert data which is outside the compact form into data which is inside the compact form is relatively simple to derive, solving the mathematics in real time to form the compact histogram during the acquisition process is impractical for a number of reasons. As an initial matter, the arithmetic required for the conversion is relatively involved due at least in part to the fact that data from outside the compact form must be reflected about the distance R=0 axis, which requires a plurality of conditional steps. In addition, although possible, it would be extremely difficult to provide a lookup table for conversion of all coincident detection pair possibilities and stop angles to distance R and projection ray angle θf possibilities. Moreover, as some systems change stop angle θs continuously during data acquisition, even if it were possible to provide a suitable lookup table, it may be impractical to update the table each time stop angle θs changes.
A second and perhaps more vexing problem with forming a compact histogram form is that such a form makes it extremely difficult to save any collected data to an inexpensive stand alone storage device which could alleviate some of the burden on the processor's memory. Storage outside a processor memory during acquisition is difficult because coincident counts for some coordinates (R, θf) must be maintained during an entire acquisition session. For example, as described above, data acquired at the initial stop angle θs is not completed until acquisition at the final stop angle is completed at the end of an acquisition session. Because coincident counts for some coordinates have to be maintained throughout an acquisition session, it is extremely difficult to define an acquisition regime which saves raw data to a stand alone storage device during acquisition.
Solutions to the problems described above have not heretofore included significant remote capabilities. In particular, communication networks, such as, the Internet or private networks, have not been used to provide remote services to such medical diagnostic systems. The advantages of remote services, such as, remote monitoring, remote system control, immediate file access from remote locations, remote file storage and archiving, remote resource pooling, remote recording, remote diagnostics, and remote high speed computations have not heretofore been employed to solve the problems discussed above.
Thus, there is a need for a medical diagnostic system which provides for the advantages of remote services and addresses the problems discussed above. In particular, there is a need for remote storage which provides large memory capacity for the large amount of data acquired with PET imaging systems. Further, there is a need for remote execution of the complicated computations required by PET imaging systems. Even further, there is a need for providing PET imaging systems with remote services over a network.
One embodiment relates to a method to be used with an imaging system which includes two opposed cameras mounted for rotation among a plurality of acquisition angles about an imaging axis for acquiring imaging data throughout an arc about the axis, the cameras collecting data corresponding to an acquisition angle range including a plurality of flight path angles at each acquisition angle. The system also includes a processor having a processor memory. The method reduces the amount of processor memory required to acquire imaging data. The method includes the steps of (a) after gathering imaging information at an acquisition angle, for each flight path angle within the acquisition angle range, determining if additional data will be collected during acquisition at the next consecutive Q acquisition angles; (b) where imaging data for a flight path angle will be collected during data acquisition at at least one of the next Q acquisition angles, maintaining imaging data for the flight path angle in the processor memory; and (c) where imaging data for a flight path angle will not be collected during data acquisition at at least one of the next Q acquisition angles, communicating the imaging data to a remote facility to store the imaging data for the flight path angle in a remote storage device via a network and to provide remote services to the imaging system.
Another embodiment relates to an apparatus for use with an imaging system including two opposed cameras mounted for rotation among a plurality of acquisition angles about an imaging axis for acquiring imaging data throughout an arc about the axis, the cameras collecting data corresponding to an acquisition angle range including a plurality of flight path angles at each acquisition angle. The system also includes a processor having a processor memory. The apparatus reduces the amount of processor memory required to acquire imaging data. The apparatus includes a programmed data processor for: (a) after gathering imaging information at an acquisition angle, for each flight path angle within the acquisition angle range, determining if additional data will be collected during acquisition at the next consecutive Q acquisition angles; (b) where imaging data for a flight path angle will be collected during data acquisition at at least one of the next Q acquisition angles, maintaining imaging data for the flight path angle in the processor memory; and (c) where imaging data for a flight path angle will not be collected during data acquisition at at least one of the next Q acquisition angles, communicating the imaging data via a network to a remote facility to store the imaging data for the flight path angle in a remote storage device and to provide remote services to the imaging system.
Another embodiment relates to an imaging system including two opposed cameras mounted for rotation among a plurality of acquisition angles about an imaging axis for acquiring imaging data throughout an arc about the axis, the cameras collecting data corresponding to an acquisition angle range including a plurality of flight path angles at each acquisition angle. The system also includes a processor having a processor memory. The system reduces the amount of processor memory required to acquire imaging data. The system includes a communications module which transmits and receives data for remote services, a remote facility coupled to the communications module via a network, and a programmed data processor coupled to the communications module. The remote facility includes a processing system coupled to a system of databases and communication components. The programmed data processor (a) after gathering imaging information at an acquisition angle, for each flight path angle within the acquisition angle range, determines if additional data will be collected during acquisition at the next consecutive Q acquisition angles; (b) where imaging data for a flight path angle will be collected during data acquisition at at least one of the next Q acquisition angles, maintains imaging data for the flight path angle in the processor memory; and (c) where imaging data for a flight path angle will not be collected during data acquisition at at least one of the next Q acquisition angles, communicates the imaging data via a network to the remote facility, stores the imaging data for the flight path angle in the system of databases, and provides remote services to the imaging system.
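Steps (a) through (c) of the embodiments above can be sketched as follows. This is a simplified, hypothetical illustration only: the 15° acceptance half-width and the 2° stop angle increment are the example values used earlier, the function names are assumptions, and the ±180° symmetry supplementing at the end of a session is omitted for brevity:

```python
THETA_M = 15.0        # assumed acceptance half-width, in degrees
STOP_INCREMENT = 2.0  # assumed increment between acquisition angles, degrees

def will_collect(flight_angle, stop_angle):
    """True if data for flight_angle is collected at stop_angle (sketch)."""
    return abs(flight_angle - stop_angle) <= THETA_M

def partition_flight_angles(flight_angles, current_stop, q):
    """After acquisition at current_stop, split flight path angles into
    those to keep in processor memory (more data will arrive within the
    next Q acquisition angles) and those whose coincident counts may be
    communicated to the remote facility for storage."""
    keep, flush = [], []
    for angle in flight_angles:
        upcoming = (current_stop + (i + 1) * STOP_INCREMENT for i in range(q))
        if any(will_collect(angle, s) for s in upcoming):
            keep.append(angle)
        else:
            flush.append(angle)
    return keep, flush
```

With these assumptions, after acquisition at the 2° stop angle with Q=1, the −15° and −13° flight path angles fall outside the next acquisition range and may be flushed to remote storage, while the 17° angle is retained in processor memory.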
Other principal features and advantages of the present invention will become apparent to those skilled in the art upon review of the following drawings, the detailed description, and the appended claims.