Abstract

When images are rotated, their scales vary, or similar objects appear in the scene, the scale-invariant feature transform (SIFT) is prone to producing wrong matching points. To address this problem, this paper proposes an algorithm for eliminating wrong SIFT matching points. The voting mechanism of the Generalized Hough Transform (GHT) is introduced to estimate the rotation and scaling of the image and to locate where the template image appears in the scene, so that unmatched points are completely rejected. Building on the observation that the neighborhood diameter ratio and the direction angle difference of correct matching pairs are quantitatively related to the image's rotation and scaling, the remaining mismatched points are then removed accurately. To improve matching efficiency, a method for finding the optimal scaling level is also proposed: a scaling multiple is learned from sample images and then applied to all images to be matched. Experimental results on images from the Inria BelgaLogos database demonstrate that the proposed algorithm eliminates wrong matching points more effectively than three other commonly used methods, achieving a higher correct matching rate and higher matching efficiency.
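As one concrete reading of the consistency check described above, the sketch below votes each putative match into a (rotation, scale) cell and keeps only the matches in the dominant cell. It is a minimal illustration, not the paper's implementation: the function name, the input layout (each keypoint given as its scale and orientation angle), and the bin widths are all assumptions.

```python
import numpy as np

def filter_matches(template_kp, scene_kp, angle_bin=15.0, log_scale_bin=0.5):
    """Keep only matches consistent with the dominant rotation/scaling.

    template_kp, scene_kp: (N, 2) arrays of (scale, angle_in_degrees);
    row i of each array forms one putative match. Returns the indices
    of the matches falling in the most-voted (rotation, scale) cell.
    """
    # Orientation difference and (log) scale ratio of each matched pair.
    d_angle = (scene_kp[:, 1] - template_kp[:, 1]) % 360.0
    log_ratio = np.log2(scene_kp[:, 0] / template_kp[:, 0])

    # Quantize into histogram cells (a coarse 2D Hough accumulator).
    a_idx = np.floor(d_angle / angle_bin).astype(int)
    s_idx = np.floor(log_ratio / log_scale_bin).astype(int)

    # Vote: collect match indices per cell, return the peak cell's matches.
    votes = {}
    for i, key in enumerate(zip(a_idx, s_idx)):
        votes.setdefault(key, []).append(i)
    return np.array(max(votes.values(), key=len))
```

Correct matches between a template and its rotated, rescaled instance share (approximately) one angle difference and one scale ratio, so they pile up in a single cell, while random mismatches scatter across cells and are discarded.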

