A feature point arrangement matching device is included, for example, in an image matching device as one of its functions. Patent literature 1 discloses an image matching device that includes the following feature point arrangement matching function. Note that the technique of patent literature 1 calculates a coordinate sequence of ordered feature points as a feature point arrangement.
This function is, in principle, composed of a registration image invariant calculation module, a search image invariant calculation module, and an invariant match evaluation module. The registration image invariant calculation module calculates an invariant from a feature point arrangement calculated from a registration image. The search image invariant calculation module calculates an invariant from a feature point arrangement of a search image. The feature point arrangement matching function with such a configuration, in principle, realizes matching of the feature point arrangements by the following operation. First, in a registration process, a feature vector is calculated in advance from the position coordinates of the feature points included in the feature point arrangement of the registration image. Next, in a search process, a feature vector is calculated from the position coordinates of the feature points included in the feature point arrangement of the search image. Subsequently, a degree of similarity is calculated between the feature vector obtained in the registration process and the feature vector obtained in the search process. Then, the feature point arrangements are matched by evaluating whether the degree of similarity is within a predetermined range.
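The registration and search flow described above can be sketched as follows. The choice of triangle area ratios as the (affine) invariant and of Euclidean distance as the degree of similarity are illustrative assumptions for this sketch, not the specific formulation of patent literature 1.

```python
import itertools
import math

def affine_invariant(points):
    """Compute a feature vector from an ordered feature point
    arrangement (list of (x, y) coordinates).

    Ratios of triangle areas are used here because they are
    invariant under affine transformation; this is a hypothetical
    choice of invariant for illustration."""
    def area(p, q, r):
        # twice the signed triangle area, taken as absolute value
        return abs((q[0] - p[0]) * (r[1] - p[1])
                   - (r[0] - p[0]) * (q[1] - p[1])) / 2.0
    areas = [area(*c) for c in itertools.combinations(points, 3)]
    base = areas[0]  # assumes the first triple is not collinear
    return [a / base for a in areas[1:]]

def arrangements_match(reg_points, srch_points, threshold=0.05):
    """Evaluate whether the degree of similarity (here, the
    Euclidean distance between the two invariant feature vectors)
    is within a predetermined range."""
    v_reg = affine_invariant(reg_points)
    v_srch = affine_invariant(srch_points)
    return math.dist(v_reg, v_srch) <= threshold
```

Because area ratios survive any affine map, a registration arrangement and an affine-transformed copy of it yield identical feature vectors, while a differently ordered arrangement generally does not.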
An image matching device disclosed in patent literature 1 includes a projective affine parameter estimation module, a partial region projective module, a projective image matching module, and an image matching result calculation module. The projective affine parameter estimation module includes the three modules that make up the feature point arrangement matching function. This image matching device calculates a geometric transformation parameter from a pair of feature point arrangements, in the two images to be matched, whose feature vectors are evaluated as a match. Using the calculated geometric transformation parameter, the image matching device projects one of the images onto the other for each partial region, matches the images for each partial region, integrates the matching results for the partial regions, and obtains a final image matching result.
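The geometric transformation parameter calculation can be illustrated with a minimal sketch: an exact affine fit from three corresponding feature points, followed by projection of a point from one image onto the other. The function names and the restriction to three point pairs are assumptions of this sketch, not the patent's implementation.

```python
def estimate_affine(src, dst):
    """Estimate the six parameters of the 2-D affine transformation
    (x, y) -> (a11*x + a12*y + tx, a21*x + a22*y + ty)
    from three corresponding point pairs, using Cramer's rule
    (hypothetical stand-in for the parameter estimation step)."""
    (x1, y1), (x2, y2), (x3, y3) = src
    det = x1 * (y2 - y3) - y1 * (x2 - x3) + (x2 * y3 - x3 * y2)
    params = []
    for coord in (0, 1):  # solve the x' row, then the y' row
        b1, b2, b3 = dst[0][coord], dst[1][coord], dst[2][coord]
        a = (b1 * (y2 - y3) - y1 * (b2 - b3) + (b2 * y3 - b3 * y2)) / det
        b = (x1 * (b2 - b3) - b1 * (x2 - x3) + (x2 * b3 - x3 * b2)) / det
        c = (x1 * (y2 * b3 - y3 * b2) - y1 * (x2 * b3 - x3 * b2)
             + b1 * (x2 * y3 - x3 * y2)) / det
        params.append((a, b, c))
    return params  # [(a11, a12, tx), (a21, a22, ty)]

def project(point, params):
    """Project a point with the estimated affine parameters."""
    (a11, a12, tx), (a21, a22, ty) = params
    x, y = point
    return (a11 * x + a12 * y + tx, a21 * x + a22 * y + ty)
```

Once the parameters are estimated, every pixel position of a partial region can be passed through `project` to overlay one image on the other before per-region matching.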
In regard to the method of generating the feature point arrangement, methods other than the one disclosed in patent literature 1 are known. For example, a document image feature quantity generation device disclosed in patent literature 2 can be used.
Further, an image matching device disclosed in patent literature 3 includes the following feature point group matching function. Note that the present invention distinguishes between a feature point group and a feature point arrangement. The term feature point group simply indicates a set of feature points. When the feature points belonging to a feature point group are ordered, the ordered feature point group shall be referred to as a feature point arrangement.
This function is, in principle, composed of two image feature data group generation units and a matching unit. The two image feature data group generation units calculate, from a basic image and a sample image respectively, a pair of a two-dimensional vector (V) and a two-dimensional pixel position (R) for each scan point. The two-dimensional vector (V) is the spatial gradient of the average luminance around a scan point that satisfies a predetermined condition in the image. The matching unit, using the relationship between the two-dimensional vectors (V) and the two-dimensional pixel positions (R) calculated by the two image feature data group generation units, performs a voting process that sets a high voting value for a particular pixel position when the feature point groups are correctly associated. This feature point group matching function realizes the process of matching the feature point groups by having the two image feature data group generation units calculate the two-dimensional vectors (V) and the two-dimensional pixel positions (R), followed by the operation of the matching unit.
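A much-simplified sketch of such a voting process is shown below, assuming for simplicity that the two images are related by pure translation: point pairs whose gradient vectors (V) agree cast votes for the displacement between their pixel positions (R), and a dominant vote peak indicates a correct association. The quantization step and tolerance are assumptions of this sketch, not values from patent literature 3.

```python
import math
from collections import Counter

def vote_translation(basic, sample, quantum=1.0):
    """Vote on the translation between two feature point groups.

    Each element of `basic` and `sample` is a pair (V, R): the
    two-dimensional gradient vector V and the pixel position R of
    one scan point. For every cross-image pair whose gradients
    agree, a vote is cast for the quantized displacement of R.
    Returns the (displacement, vote count) with the most votes."""
    votes = Counter()
    for v1, r1 in basic:
        for v2, r2 in sample:
            # corresponding points should carry the same gradient
            if (math.isclose(v1[0], v2[0], abs_tol=1e-6)
                    and math.isclose(v1[1], v2[1], abs_tol=1e-6)):
                dx = round((r2[0] - r1[0]) / quantum)
                dy = round((r2[1] - r1[1]) / quantum)
                votes[(dx, dy)] += 1
    return votes.most_common(1)[0] if votes else None
```

When the sample image is a translated copy of the basic image, all correctly associated points vote for the same cell, producing the high voting value the matching unit looks for.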
In the technique disclosed in patent literature 3, voting is performed based on the relationship between the two-dimensional vectors and the two-dimensional pixel positions. The technique is therefore considered effective when the two images to be voted on are translated relative to each other, rotated, or related by a combination of the two. However, this technique is not suitable, in principle, when the two images to be voted on are related by a two-dimensional affine transformation or a projective transformation (any transformation that projects a plane onto a plane).
An image matching method disclosed in patent literature 4 calculates the degree of similarity in image matching. This image matching method estimates a geometric transformation parameter from the matching result for each local region, using information on pairs of corresponding local regions. Next, the image matching method corrects the positions based on the estimated geometric transformation parameter, performs a relative position voting process, and calculates the degree of similarity in image matching. In this image matching method, local regions, rather than feature points, are associated as the result of matching. This image matching method can, in principle, be applied even to an affine transformed or projective transformed image, as long as the association between the local regions can be obtained appropriately. However, when the image is affine transformed or projective transformed, the larger the size of the local region, the more the matching accuracy of the local regions deteriorates. Therefore, the accuracy of matching the resulting images is considered to deteriorate as well.
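The position correction followed by relative position voting can be sketched as follows. Defining the degree of similarity as the fraction of region pairs falling into the dominant vote cell is an illustrative assumption of this sketch, not the exact formula of patent literature 4.

```python
from collections import Counter

def similarity_by_relative_position_voting(pairs, transform, quantum=2.0):
    """Sketch of relative position voting.

    `pairs` holds (p_search, p_registered) center positions of
    corresponding local regions; `transform` is the estimated
    geometric transformation. Each search position is corrected
    by the transform, the residual relative position is quantized
    into a vote cell, and the degree of similarity is taken as the
    share of pairs in the dominant cell (hypothetical definition)."""
    votes = Counter()
    for p_search, p_registered in pairs:
        cx, cy = transform(p_search)  # position correction step
        cell = (round((p_registered[0] - cx) / quantum),
                round((p_registered[1] - cy) / quantum))
        votes[cell] += 1
    return votes.most_common(1)[0][1] / len(pairs)
```

Correctly associated regions accumulate in one cell after correction, while mismatched regions scatter, so the peak share directly reflects matching quality.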