Communication of real-time data between image acquisition equipment, minimally invasive sensing, therapy delivery and monitoring devices, and software applications executing on workstations is a common feature of medical image-guided intervention (IGI) applications. Surgical tools or minimally invasive instruments inserted into the body, e.g., needles, catheters, guidewires, and imaging probes, can be located relative to the patient's body by tracking systems that determine the position and orientation of interventional instruments, and possibly the entire instrument shape, and communicate these measurements in real time to external devices. These systems use a variety of intrinsic detection technologies, e.g., optical camera-based detection of markers embedded on instruments, electromagnetic sensing of miniaturized sensors embedded on instruments placed in an electromagnetic field, optical shape sensing based on Fiber Bragg Grating (FBG) or Rayleigh scattering concepts, and other technologies. Manufacturers of tracking systems may use different data communication protocols and proprietary interfaces, which slows the integration of IGI applications into clinical practice. IGI applications must interface individually with each manufacturer's equipment to acquire tracking information, which raises the re-engineering burden and hinders interoperability when multiple tracking systems, using one or more intrinsic technologies, are used simultaneously in the same intervention. Without a standard interface to IGI applications, errors and redundancy may be introduced.
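To make the interoperability problem concrete, the following is a minimal sketch of what a vendor-neutral tracking message might look like: a pose (position plus unit-quaternion orientation) with a device name and timestamp, serialized as a length-prefixed payload suitable for streaming over a socket. The `TrackedPose` type, the field layout, and the JSON-over-length-prefix encoding are all illustrative assumptions, not an existing standard; real tracking interfaces typically define their own binary wire formats.

```python
import json
import struct
from dataclasses import dataclass, asdict

@dataclass
class TrackedPose:
    """One tracking measurement from an instrument sensor.

    Hypothetical vendor-neutral record: real protocols define
    their own (usually binary) layouts.
    """
    device_name: str       # e.g., "needle-1"
    timestamp: float       # seconds, tracker clock
    position: tuple        # (x, y, z) in mm, tracker coordinate frame
    orientation: tuple     # unit quaternion (qx, qy, qz, qw)

def encode_pose(pose: TrackedPose) -> bytes:
    """Serialize a pose as a 4-byte big-endian length prefix
    followed by a UTF-8 JSON payload."""
    payload = json.dumps(asdict(pose)).encode("utf-8")
    return struct.pack("!I", len(payload)) + payload

def decode_pose(message: bytes) -> TrackedPose:
    """Inverse of encode_pose: read the length prefix, parse the payload."""
    (length,) = struct.unpack("!I", message[:4])
    data = json.loads(message[4:4 + length].decode("utf-8"))
    data["position"] = tuple(data["position"])
    data["orientation"] = tuple(data["orientation"])
    return TrackedPose(**data)
```

With a shared message format like this, an IGI application could consume poses from optical, electromagnetic, or shape-sensing trackers through a single decoder instead of one adapter per manufacturer.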
Software applications exist that provide an interface to tracking hardware and include other functionality, such as reading and displaying medical images, point-based registration, etc. However, these applications do not combine or coordinate information from multiple tracking systems.