Superimposing a real-time representation of a medical device, such as a catheter or a biopsy needle, tracked by a Medical Positioning System (MPS), on a medical image acquired by a modality such as X-ray, Magnetic Resonance Imaging (MRI), Positron Emission Tomography (PET) and the like, during a medical procedure, is known in the art. This medical image serves as a map, aiding the medical staff performing the medical procedure to navigate the medical device within a volume of interest in the body of the patient subjected to the procedure. In order for that superposition to reflect the true position of the medical device within that volume of interest, the coordinate system associated with the MPS must be registered with the coordinate system associated with the medical image.
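Such a registration is commonly expressed as a rigid transformation, i.e., a rotation and a translation mapping MPS coordinates into image coordinates. The following sketch is illustrative only; the example rotation, offset and variable names are assumed here, not taken from any cited reference. It shows how a tracked point would be mapped once the registration is known:

```python
import math

def apply_rigid_transform(point, rotation, translation):
    """Map a 3D point through a rigid transform: p' = R * p + t."""
    return tuple(
        sum(rotation[i][j] * point[j] for j in range(3)) + translation[i]
        for i in range(3)
    )

# Hypothetical registration result: a 90-degree rotation about the z axis
# plus an offset, mapping MPS coordinates into image coordinates.
theta = math.pi / 2
R = [[math.cos(theta), -math.sin(theta), 0],
     [math.sin(theta),  math.cos(theta), 0],
     [0,                0,               1]]
t = [10.0, 0.0, 5.0]

catheter_tip_mps = (1.0, 0.0, 0.0)  # position reported by the MPS
catheter_tip_img = apply_rigid_transform(catheter_tip_mps, R, t)
```

Once such a transform is established, every position reported by the MPS can be drawn at the correct location on the medical image.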
U.S. Pat. No. 6,149,592 to Yanof et al., entitled “Integrated Fluoroscopic Image Data, Volumetric Image Data, and Surgical Device Position Data”, is directed to a system integrating a CT scanner, a fluoroscopic X-ray device and a mechanical-arm type, minimally invasive surgical tool. In one embodiment, mechanical interconnections between the CT scanner and the fluoroscopic device provide a fixed and known offset therebetween. The mechanical interconnection between the surgical tool and the CT scanner, measured by resolvers and encoders, provides an indication of the position and orientation of the surgical tool relative to the CT scanner. Because the fluoroscopic system is also mechanically constrained, the position and orientation of the surgical tool relative to the fluoroscopic system is also known.
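The fixed, known offsets described above amount to chaining rigid transforms: a tool pose measured in the CT frame, composed with the fixed CT-to-fluoroscope offset, yields the tool pose in the fluoroscopic frame. A minimal sketch follows; the identity rotations and the particular offsets are hypothetical values chosen for illustration:

```python
def compose(T_ab, T_bc):
    """Chain rigid transforms: if T_ab maps frame B to A and T_bc maps
    frame C to B, the composition maps C to A.
    Each transform is a pair (R, t) acting as p' = R * p + t.
    """
    R_ab, t_ab = T_ab
    R_bc, t_bc = T_bc
    R = [[sum(R_ab[i][k] * R_bc[k][j] for k in range(3)) for j in range(3)]
         for i in range(3)]
    t = [sum(R_ab[i][k] * t_bc[k] for k in range(3)) + t_ab[i]
         for i in range(3)]
    return (R, t)

I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
# Hypothetical values: tool pose in the CT frame (from encoders/resolvers)
# and the fixed, known CT-to-fluoroscope offset.
T_ct_tool = (I, [5.0, 0.0, 0.0])
T_fluoro_ct = (I, [0.0, -2.0, 0.0])
T_fluoro_tool = compose(T_fluoro_ct, T_ct_tool)
```

Because every link in the chain is mechanically fixed or directly measured, no image-based registration step is needed in that embodiment.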
In another embodiment, a plurality of transmitters, such as Light Emitting Diodes (LEDs), are mounted in a fixed and known relationship to the surgical tool or pointer. An array of receivers is mounted in a fixed relationship to the CT scanner. The surgical tool or pointer is positioned on a plurality of markers, which are in a fixed relationship to the coordinate system of the fluoroscopic scanner. Thus, the surgical tool coordinate system and the fluoroscopic scanner coordinate system are readily aligned.
U.S. Pat. No. 6,782,287 to Grzeszczuk et al., entitled “Method and Apparatus for Tracking a Medical Instrument Based on Image Registration”, is directed to an apparatus, method and system for tracking a medical instrument as it is moved in an operating space, by constructing a composite 3D rendition of at least a part of the operating space, based on an algorithm that registers pre-operative 3D diagnostic scans of the operating space with real-time stereo X-ray or radiograph images of the operating space. An X-ray image intensifier mounted on a C-arm, and the surgical instrument, are equipped with emitters defining their respective local coordinate systems. The emitters may be LED markers which communicate with a tracking device or position sensor. The position sensor tracks these components within the operating space, enabling coordinate transformations between the various local coordinate systems. Image data acquired by the X-ray camera are used to register a pre-operative CT data set to a reference frame of the patient, by taking at least two protocoled fluoroscopic views of the operating space, including the patient target site. These images are then used to compute the C-arm-to-CT registration. With the surgical tool visible in at least two fluoroscopic views, the tool is then back-projected into the reference frame of the CT data set. The position and orientation of the tool can then be visualized with respect to a 3D image model of the region of interest. The surgical tool can also be tracked externally using the tracking device.
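The idea of recovering a 3D position from two views can be illustrated under an idealized assumption of two orthogonal parallel projections; real fluoroscopic geometry is a perspective cone-beam, so this sketch is only a simplified stand-in for the back-projection step, and the coordinate conventions are assumed:

```python
def back_project(ap_view, lateral_view):
    """Recover a 3D point from two idealized orthogonal parallel projections.

    ap_view:      (x, z) as seen in the anterior-posterior image
    lateral_view: (y, z) as seen in the lateral image
    The z coordinate appears in both views and is averaged to absorb
    small measurement discrepancies.
    """
    x, z_ap = ap_view
    y, z_lat = lateral_view
    return (x, y, (z_ap + z_lat) / 2.0)

# Hypothetical detector readings of the same tool tip in two views.
tool_tip = back_project(ap_view=(12.0, 40.0), lateral_view=(-3.0, 40.0))
```

With the tool tip located in three dimensions, it can be drawn in the reference frame of the pre-operative data set.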
U.S. Pat. No. 6,246,898 to Vesely et al., entitled “Method for Carrying Out a Medical Procedure Using a Three-Dimensional Tracking and Imaging System”, is directed to a system including a 3D tracking module, an imaging modality, a registration module, an instrument (e.g., catheter), reference transducers and mobile transducers. The transducers may be ultrasonic or electromagnetic transducers. The mobile transducers are coupled with the instrument and with the 3D tracking module. The registration module is coupled with the 3D tracking module and with the imaging modality. The 3D tracking module transforms the measurements of the transducers into XYZ coordinates relative to a reference axis, indicating the position of the instrument. A 3D image, representing the position, size and shape of the instrument, based on the 3D coordinates, is constructed. The imaging modality acquires 2D, 3D or 4D image data sets from an imaging source (e.g., MRI, CT, US). The registration module registers the position of the instrument with the spatial coordinates of the image data set by registering features in the image, such as the reference transducers, with their position in the measuring coordinate system (i.e., the 3D tracking module coordinate system).
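Transforming transducer measurements into XYZ coordinates typically rests on range measurements (e.g., ultrasonic time-of-flight) to reference transducers at known positions. A minimal two-dimensional trilateration sketch follows; it is illustrative only and is not the actual algorithm of Vesely et al.:

```python
def trilaterate_2d(anchors, distances):
    """Locate a point from its distances to three reference transducers
    at known 2D positions. Subtracting the circle equations pairwise
    yields a 2x2 linear system, solved here with Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = distances
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # zero iff the anchors are collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Reference transducers at assumed known positions; the distances below
# correspond to a mobile transducer located at (3, 4).
pos = trilaterate_2d([(0, 0), (10, 0), (0, 10)],
                     [5.0, (49 + 16) ** 0.5, (9 + 36) ** 0.5])
```

The 3D case adds one coordinate and one reference transducer but follows the same pattern of differencing the range equations into a linear system.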
U.S. Patent Application Publication 2005/0182319 to Glossop, entitled “Method and Apparatus for Registration, Verification, and Referencing of Internal Organs”, is directed to a method for registering image information of an anatomical region (image space) with position information of a path within the anatomical region (patient space). One or more images of the anatomical region are obtained (e.g., CT, PET, MRI). A three-dimensional model of the anatomical region is constructed. The position information of the path within the anatomical region is obtained by inserting a registration device into a conduit, while a tracking device simultaneously samples the coordinates of a position indicating element coupled with the registration device. A three-dimensional path (“centerline”) of the registration device in the anatomical region is determined. The registration device includes at least one position indicating element (e.g., a coil that detects a magnetic field emitted by an electromagnetic tracking device). The image coordinate system is registered with the coordinate system of the tracking device, using the 3D image model and the 3D path of the registration device. Thus, it is possible to represent on the image a graphical representation of an instrument equipped with a position indicating element. However, in the method of Glossop, there is no guarantee that the three-dimensional path obtained by the tracking device is indeed the path of the center of the conduit. The tracking device may have traced a path close to the edges of the conduit, or a sinusoidal path within the conduit. Therefore, the registration of the image coordinate system with the coordinate system of the tracking device may be rendered inaccurate.
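The inaccuracy noted above can be made concrete: if the position indicating element sweeps a sinusoidal path inside a straight conduit, the sampled "centerline" can deviate from the true centerline by up to the conduit radius. A small numerical illustration, with an assumed geometry:

```python
import math

def max_centerline_deviation(samples, centerline_y=0.0):
    """Largest distance of sampled path points from a straight centerline."""
    return max(abs(y - centerline_y) for _, y in samples)

# Assumed geometry: a straight conduit of radius 2 mm along the x axis.
# The registration device traces a sinusoid that hugs the conduit wall
# instead of following the centerline.
radius = 2.0
path = [(x / 10.0, radius * math.sin(x / 10.0)) for x in range(0, 63)]
deviation = max_centerline_deviation(path)
# The sampled path strays nearly the full conduit radius (about 2 mm)
# from the true centerline, so a registration that treats this path as
# the centerline inherits an error of up to that magnitude.
```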
U.S. Patent Application Publication 2006/0262970 to Boese et al., entitled “Method and Device for Registering 2D Projection Images Relative to a 3D Image Data Record”, is directed to a method for registering 2D projection images of an object relative to a 3D image data record of the same object. In the method of Boese et al., pre-operative 3D data are recorded and a 3D feature (e.g., a model of a vessel tree) is extracted. The same feature is recorded in at least two 2D fluoroscopy images taken from different C-arm angulations. A 3D symbolic reconstruction of the feature is determined from the two 2D fluoroscopy images. The coordinate systems of the 2D images and of the 3D data are registered by matching the feature reconstructed from the 2D images with the feature extracted from the 3D data.
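Reconstructing a feature point from two angulations can be sketched under an idealized parallel-beam model (an assumption made here for illustration; real C-arm geometry is cone-beam). Each angulation constrains the point to a line in the plane, and two distinct angulations determine its in-plane position:

```python
import math

def reconstruct_from_two_views(theta1, s1, theta2, s2):
    """Intersect two projection constraints cos(t)*x + sin(t)*y = s.

    theta1, theta2: C-arm angulations in radians.
    s1, s2: measured detector coordinates of the same feature point.
    Assumes an idealized parallel-beam geometry.
    """
    a1, b1 = math.cos(theta1), math.sin(theta1)
    a2, b2 = math.cos(theta2), math.sin(theta2)
    det = a1 * b2 - a2 * b1  # zero iff the two angulations coincide
    x = (s1 * b2 - s2 * b1) / det
    y = (a1 * s2 - a2 * s1) / det
    return (x, y)

# Hypothetical readings for a feature at (3, 4): the 0-radian view
# measures its x coordinate, the 90-degree view its y coordinate.
point = reconstruct_from_two_views(0.0, 3.0, math.pi / 2, 4.0)
```

Repeating this for many points along the vessel tree yields the symbolic 3D reconstruction that is then matched against the feature extracted from the pre-operative data.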