Abstract

Image matching plays an essential role in various computer vision applications. Recent research has found that the relative positions between a feature point and its local neighbors can be used to build a K Nearest Neighbors (KNN) graph that eliminates matches with geometric inconsistency. However, the existing KNN graph construction method is unstable under viewpoint changes, because the Euclidean metric it relies on cannot accurately reflect the spatial relationship of the feature points. To address this problem, this paper proposes a robust image matching algorithm that uses local affine regions and a Mahalanobis metric. First, feature points are detected in each image together with the affine regions around them, not just their coordinates. Next, the feature points and the affine information are used to build a KNN graph for each image under the Mahalanobis metric. Finally, mismatches are eliminated by finding a consensus subgraph. Experimental results demonstrate that the proposed algorithm builds a robust KNN graph under large viewpoint changes and achieves higher matching accuracy.
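As a rough illustration of the graph-construction step, the sketch below builds a per-point KNN graph under a Mahalanobis metric induced by each feature's local affine region. The function name, the use of the inverse affine shape matrix as the metric weight, and the choice of k are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def mahalanobis_knn(points, shapes, k=5):
    """Build a KNN graph whose neighborhoods are chosen with a Mahalanobis
    metric derived from each feature's local affine region.

    points : (N, 2) array of feature point coordinates.
    shapes : (N, 2, 2) array of local affine shape matrices (e.g. the
             second-moment matrices of an affine-covariant detector);
             their inverses are assumed to act as the metric weights.
    Returns an (N, k) array of neighbor indices, one row per feature point.
    """
    n = len(points)
    neighbors = np.zeros((n, k), dtype=int)
    for i in range(n):
        # Metric weight for point i, derived from its affine region.
        M = np.linalg.inv(shapes[i])
        diff = points - points[i]                     # offsets to all points
        d2 = np.einsum('nj,jk,nk->n', diff, M, diff)  # squared Mahalanobis distances
        d2[i] = np.inf                                # exclude the point itself
        neighbors[i] = np.argsort(d2)[:k]             # k nearest under this metric
    return neighbors
```

Under this construction, the neighborhood of each point adapts to the local affine shape, so the resulting graph is less sensitive to viewpoint-induced distortions than one built with the plain Euclidean metric.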
