Field of the Invention
The present invention relates to a technique of processing a medical image captured by a medical image collection device.
Description of the Related Art
In the medical field, a doctor makes a diagnosis by displaying, on a monitor, a medical image obtained by capturing an object, and interpreting the displayed medical image. As medical image collection devices (to be referred to as modalities hereinafter) used to capture medical images, an ultrasonic image diagnosis device, a magnetic resonance imaging device (to be referred to as an MRI device hereinafter), an X-ray computed tomography device (to be referred to as an X-ray CT device hereinafter), and the like are known.
It may be difficult to make a diagnosis by observing only individual medical images captured by these modalities. To solve this problem, attempts have been made to make a diagnosis more accurately by comparing a plurality of types of medical images, such as medical images captured by a plurality of modalities or medical images captured at different dates and times.
In order to use a plurality of types of medical images in diagnosis, it is important to identify (associate) lesion portions and the like in the respective medical images. Since automatic identification by image processing is difficult due to the influences of different modalities, deformations of the object, and the like, it is a common practice for an operator such as a doctor to manually (visually) identify lesion portions while observing the images. While observing one medical image (to be referred to as a reference image hereinafter), the operator searches another medical image (to be referred to as a target image hereinafter) for a lesion portion corresponding to that in the one medical image, and identifies the lesion portions based on similarities in the shapes of the lesion portions, the appearances of their surrounding portions, and the like. If a device which presents the medical images has a function of presenting the reference image and the target image side by side, the operator can readily compare the images of the lesion portions, and identify the lesion portions.
Against this backdrop, an attempt has been made to generate (extract), based on a three-dimensional medical image (three-dimensional volume data) such as a CT or MRI image captured in advance, an image of a cross section (to be referred to as a correspondence cross section hereinafter) corresponding to the imaging cross section of an ultrasonic image being captured in real time, and to present the generated image. Note that “corresponding” indicates that the imaging cross section of the ultrasonic image and the cross section on the three-dimensional medical image express almost the same portion of the object. In patent literature 1 (Japanese Patent No. 03871747), for example, the position and orientation of an ultrasonic probe are measured to obtain the relationship between the coordinate system of an ultrasonic image serving as the target image and that of a three-dimensional image serving as the reference image. By assuming that the object is rigid, a cross section obtained by transforming the position and orientation of the imaging cross section of the ultrasonic image into the coordinate system of the reference image is set as the correspondence cross section (the calculated value thereof). An image of the correspondence cross section is then extracted and generated from the three-dimensional medical image.
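The procedure described above can be sketched as follows. This is a minimal illustrative example, not the implementation of patent literature 1: it assumes the measured probe pose has already been combined into a single 4x4 rigid transform (here called `pose`) that maps points on the ultrasonic imaging plane into voxel coordinates of the three-dimensional volume, and it uses simple nearest-neighbour sampling for brevity.

```python
import numpy as np

def extract_correspondence_slice(volume, pose, plane_shape=(64, 64), spacing=1.0):
    """Sample an oblique correspondence cross section from a 3D volume.

    volume : (Z, Y, X) ndarray, the pre-captured 3D medical image.
    pose   : 4x4 rigid transform mapping homogeneous in-plane points
             (u, v, 0, 1) to voxel coordinates (x, y, z, 1) of the volume.
             In practice this would be derived from the measured position
             and orientation of the ultrasonic probe (hypothetical input).
    """
    h, w = plane_shape
    # Grid of pixel positions on the imaging plane
    us, vs = np.meshgrid(np.arange(w) * spacing, np.arange(h) * spacing)
    pts = np.stack([us, vs, np.zeros_like(us), np.ones_like(us)], axis=-1)
    # Rigid-body assumption: one transform maps every plane point into the volume
    vox = pts @ pose.T
    x, y, z = vox[..., 0], vox[..., 1], vox[..., 2]
    xi, yi, zi = (np.round(c).astype(int) for c in (x, y, z))
    # Keep only samples that fall inside the volume; zero elsewhere
    inside = ((0 <= xi) & (xi < volume.shape[2]) &
              (0 <= yi) & (yi < volume.shape[1]) &
              (0 <= zi) & (zi < volume.shape[0]))
    out = np.zeros(plane_shape, dtype=volume.dtype)
    out[inside] = volume[zi[inside], yi[inside], xi[inside]]
    return out
```

With an identity pose, the extracted cross section coincides with the first axial slice of the volume; a rotated or translated pose yields an oblique reslice, which is the operation the related art performs to display the correspondence cross section alongside the live ultrasonic image.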
The measurement accuracy of the position and orientation of the ultrasonic probe is not perfect, and the shape and posture of the object at the time of capturing the three-dimensional medical image do not always match those at the time of capturing the ultrasonic image. For these reasons, the processing of calculating a correspondence cross section includes an error. That is, a calculated correspondence cross section may shift from the true correspondence cross section to some extent. However, the display method disclosed in patent literature 1 described above does not consider this error. Depending on the degree of the error, a lesion portion may not be displayed on the image of the calculated correspondence cross section even though the lesion portion has been extracted on the ultrasonic image. As a result, the efficiency of the operator's comparison of lesion portions and diagnosis decreases.