A collection of data having a common purpose is called an “object”. In general, an object comprises a large number of pieces of data, e.g., a still image, a moving image, sound, or a document. In the field of information search, matching technology, in which two objects are compared with each other and the similarity between them is determined, is of great importance. Matching technology is applied to various fields including image search, speech search, and document search.
As conventional matching technologies, e.g., the technologies disclosed in Patent Documents 1 to 4 are well known.
According to a matching method disclosed in Patent Document 1, the set of all feature points of one image is compared with the set of all feature points of another image, and when the two sets are similar to each other, it is determined that the one image is similar to the other image.
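By way of illustration only (this is a hypothetical sketch, not the procedure of Patent Document 1), such set-to-set comparison of feature points can be modeled by treating each feature point as 2-D coordinates and counting how many points in one set have a nearby counterpart in the other:

```python
import math

def point_set_similarity(points_a, points_b, tolerance=5.0):
    """Fraction of points in points_a that have a counterpart in points_b
    within the given distance tolerance (illustrative similarity measure)."""
    if not points_a:
        return 0.0
    matched = 0
    for (xa, ya) in points_a:
        if any(math.hypot(xa - xb, ya - yb) <= tolerance
               for (xb, yb) in points_b):
            matched += 1
    return matched / len(points_a)

# Hypothetical feature-point sets extracted from two images
image1 = [(10, 12), (40, 42), (70, 75)]
image2 = [(11, 13), (41, 40), (90, 20)]
score = point_set_similarity(image1, image2)  # 2 of 3 points have a match
```

When `score` exceeds a chosen threshold, the two images would be judged similar.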
With a speaker identifying apparatus disclosed in Patent Document 2, matching processing is performed in which speech uttered by a speaker is spectrum-analyzed and the amount of features thus obtained is compared with the amounts of features registered in a database, thereby identifying the speaker.
Further, with a document search method disclosed in Patent Document 3, matching processing is performed in which the appearance frequency of index terms appearing in a document is determined as the amount of features of the document, and documents are searched on that basis.
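As an illustrative sketch of this idea (hypothetical, not the exact procedure of Patent Document 3), index-term appearance frequencies can serve as document feature values, and documents can be ranked against a query by those frequencies:

```python
from collections import Counter

def term_frequencies(text, index_terms):
    """Count how often each index term appears in the document text."""
    words = Counter(text.lower().split())
    return {term: words[term] for term in index_terms}

def score(doc_features, query_terms):
    """Simple relevance score: total frequency of the query terms."""
    return sum(doc_features.get(term, 0) for term in query_terms)

# Hypothetical index terms and a tiny document collection
index_terms = ["image", "speech", "search"]
docs = {
    "doc1": "image search and image matching",
    "doc2": "speech search systems",
}
features = {name: term_frequencies(text, index_terms)
            for name, text in docs.items()}
best = max(features, key=lambda name: score(features[name], ["image"]))
# best == "doc1", since "image" appears twice there and not at all in doc2
```

The frequency vector here plays the role of the document's amount of features; only a few variables per document are needed, which keeps the comparison cheap.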
With the above-mentioned conventional matching methods, a relatively small amount of features is extracted from an object serving as a matching target, and the similarities between the amounts of features are compared. Therefore, the number of variables used is small, the calculation processing is simple, and fast processing is possible.
In the field of information search, fast matching processing is required, and high matching precision is also needed. Improving the matching precision requires increasing the number of feature points and setting the amount of features as a vector of as high a dimension as possible. However, if the number of feature points and the number of dimensions are increased, the calculation processing becomes complicated, and the requirement for fast processing cannot be met.
To address this, with a matching method disclosed in Patent Document 4, “the amount of features of an image” is represented as a point in N-dimensional space; the points of two images are mapped onto a space-filling curve that fills the N-dimensional space, and are thereby further mapped into one-dimensional space, and the similarity between the two images is estimated from the distance between the two mapped points.
The “amount of features of the image” used in this matching method is a histogram of the image created by using 600 different color bins. That is, with this matching method, the combination of the numbers of pixels corresponding to the 600 different colors is set as the “amount of features of the image”, and a 600-dimensional space is assumed. The number of pixels in each color bin of the histogram corresponds to a coordinate in the 600-dimensional space, so the histogram (i.e., the “amount of features of the image”) is represented as a point in the 600-dimensional space. Furthermore, the point is mapped onto a space-filling curve that fills the 600-dimensional space, thereby mapping the point into one-dimensional space.
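A minimal two-dimensional sketch of this idea follows (the method itself uses 600 dimensions; `hilbert_index` is an illustrative helper implementing the classic iterative algorithm for the Hilbert curve, one well-known space-filling curve):

```python
def hilbert_index(n, x, y):
    """Map point (x, y) on an n-by-n grid (n a power of two) to its 1-D
    position along a Hilbert curve (classic iterative algorithm)."""
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if (x & s) > 0 else 0
        ry = 1 if (y & s) > 0 else 0
        d += s * s * ((3 * rx) ^ ry)
        if ry == 0:            # rotate/reflect the quadrant
            if rx == 1:
                x = n - 1 - x
                y = n - 1 - y
            x, y = y, x
        s //= 2
    return d

# Two hypothetical histograms with only two color bins (pixel counts 0..7),
# so each histogram is a point on an 8-by-8 grid rather than in 600-D space.
h1 = (3, 5)
h2 = (4, 5)
distance_1d = abs(hilbert_index(8, *h1) - hilbert_index(8, *h2))
# A small 1-D distance suggests the two histograms, and hence the two
# images, are similar; comparing scalars is far cheaper than comparing
# high-dimensional vectors directly.
```

Because the Hilbert curve tends to keep nearby grid points at nearby positions along the curve, the one-dimensional distance serves as an inexpensive estimate of similarity in the original space.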
[Patent Document 1]    Japanese Unexamined Patent Application Publication No. 2001-109885
[Patent Document 2]    Japanese Unexamined Patent Application Publication No. 2004-294755
[Patent Document 3]    Japanese Unexamined Patent Application Publication No. 2005-025465
[Patent Document 4]    PCT Japanese Translation Patent Publication No. 2002-524789
[Patent Document 5]    Japanese Unexamined Patent Application Publication No. 2002-150287
[Patent Document 6]    Japanese Unexamined Patent Application Publication No. 2004-312693
[Non-Patent Document 1]    Seichiro KAMATA, February 1997, “View on Information Compression of Grayscale Image Using Hilbert Scanning”, Journal of the Institute of Electronics, Information and Communication Engineers, Vol. J80-D-II, No. 2, pp. 426-433