1. Field of the Invention
The present invention relates to an ultrasound diagnosis apparatus, a medical image display apparatus and a medical image displaying method, and more particularly, to an ultrasound diagnosis apparatus, a medical image display apparatus and a medical image displaying method that can generate and display virtual endoscopy (fly-through) image data based on three-dimensional (3D) image data (hereinafter, "volume data") acquired by performing 3D scans over a target organ in an object.
2. Background of the Invention
An ultrasound diagnosis apparatus transmits ultrasound to, and receives ultrasound from, a diagnosing target region in an object through a plurality of ultrasound transducers installed in a tip portion of an ultrasound probe. Simply by touching the ultrasound probe to a patient's body surface, image data of the target region is generated, and the generated image data can be displayed on a monitor in real time. Accordingly, an ultrasound diagnosis apparatus is widely used as an apparatus for diagnosing the status of various target organs in a patient's body.
Volume data can be acquired by moving a one-dimensional (1D) array transducer in a direction orthogonal to the array direction, or by using a two-dimensional (2D) array ultrasound probe. The 2D array ultrasound probe includes a plurality of transducers arranged in both the azimuth and elevation directions. Recently, it has become possible to improve the operability of an ultrasound examination by generating 3D image data and multi-planar reconstruction image data (hereinafter, "MPR image data") from the volume data acquired in 3D scans over a target organ in an object.
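The MPR generation mentioned above amounts to resampling the acquired volume data along an arbitrary cutting plane. The following is a minimal illustrative sketch, not the claimed apparatus: the function name `extract_mpr_slice` and all parameters are hypothetical, nearest-neighbour sampling is used for brevity (a clinical implementation would interpolate, e.g. trilinearly), and the volume is assumed to be a plain 3D array of voxel values.

```python
import numpy as np

def extract_mpr_slice(volume, origin, u_axis, v_axis, size, spacing=1.0):
    """Sample an arbitrary (oblique) MPR plane from a 3D volume.

    `origin` is the voxel coordinate of the plane's corner; `u_axis`
    and `v_axis` are orthogonal unit vectors spanning the plane;
    `size` is (rows, cols) of the output image.
    """
    u = np.asarray(u_axis, float)
    v = np.asarray(v_axis, float)
    rows, cols = size
    r, c = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
    # Voxel coordinate of every pixel on the requested plane.
    pts = (np.asarray(origin, float)[:, None, None]
           + spacing * (r * u[:, None, None] + c * v[:, None, None]))
    # Nearest-neighbour lookup, clipped to the volume bounds.
    idx = np.clip(np.round(pts).astype(int), 0,
                  np.array(volume.shape)[:, None, None] - 1)
    return volume[idx[0], idx[1], idx[2]]

# An axis-aligned plane simply reproduces the corresponding array slice.
vol = np.arange(4 * 4 * 4, dtype=float).reshape(4, 4, 4)
mpr = extract_mpr_slice(vol, origin=(1, 0, 0),
                        u_axis=(0, 1, 0), v_axis=(0, 0, 1), size=(4, 4))
```

Because the plane is defined by an origin and two spanning vectors, the same routine produces axis-aligned and oblique MPR images alike.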
Further, it has recently been proposed to set a virtual viewing point of an observer inside a hollow organ in the volume data acquired by performing 3D scans on an object, in order to display an inner surface of the hollow organ, such as a blood vessel, observed from the viewing point as virtual endoscopy ("fly-through") image data (for example, see Japanese Patent Application Publication 2005-110973).
According to the proposed method, it becomes possible to generate endoscopy image data based on volume data acquired from outside of an object, without inserting an endoscope into the body of the object. Consequently, the virtual endoscopy ("fly-through") image data can significantly reduce the invasive danger to the object during an examination. Further, since a viewing point and a viewing direction can be freely set within a hollow (lumen) organ such as an alimentary canal or a blood vessel, the proposed method can safely examine even thin hollow organs with high accuracy.
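The fly-through display described above can be understood as perspective ray casting from a viewing point placed inside the lumen: each ray is marched through the volume until it reaches the organ wall. The sketch below is illustrative only and is not the patented method: the function name `fly_through_depth`, its parameters, and the binary `is_wall` predicate are all hypothetical assumptions, and the renderer returns a simple depth image rather than a shaded surface.

```python
import numpy as np

def fly_through_depth(is_wall, eye, forward, up, fov_deg=90.0,
                      img=(32, 32), step=0.5, max_steps=400):
    """Render a depth image from a viewpoint inside a hollow organ by
    marching perspective rays until each one hits the organ wall.
    `is_wall(p)` returns True where wall tissue is present."""
    f = np.asarray(forward, float); f = f / np.linalg.norm(f)
    r = np.cross(f, np.asarray(up, float)); r = r / np.linalg.norm(r)
    u = np.cross(r, f)                      # orthonormal camera frame
    h, w = img
    half = np.tan(np.radians(fov_deg) / 2)  # half-width of image plane
    depth = np.full(img, np.nan)
    for i in range(h):
        for j in range(w):
            # Perspective ray through the centre of pixel (i, j).
            y = (1 - 2 * (i + 0.5) / h) * half
            x = (2 * (j + 0.5) / w - 1) * half
            d = f + x * r + y * u
            d = d / np.linalg.norm(d)
            p = np.asarray(eye, float)
            for s in range(max_steps):
                p = p + step * d
                if is_wall(p):               # ray reached the wall
                    depth[i, j] = (s + 1) * step
                    break
    return depth

# Demo: viewpoint at the centre of a cylindrical "vessel" of radius 10
# (axis along z); rays looking sideways hit the wall about 10 units away.
wall = lambda p: p[0] ** 2 + p[1] ** 2 >= 10.0 ** 2
depth = fly_through_depth(wall, eye=(0, 0, 0),
                          forward=(1, 0, 0), up=(0, 0, 1))
```

Moving `eye` along the lumen centreline frame by frame yields the fly-through sequence; a practical renderer would additionally shade the wall surface rather than report bare hit distances.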
However, like endoscopy image data acquired through an actual endoscope, the proposed method can observe only the surface status of a lumen organ. Thus, it has been impossible for the proposed method to examine the internal tissue status of a target hollow organ. Consequently, the conventional fly-through method cannot accurately grasp the degree of infiltration or invasion within a diagnosing target wall of a lumen organ, even though accurately grasping such degrees is very important for a disease-stage examination of a malignant tumor.