Abstract

The Hausdorff distance can be used to measure the similarity of two point sets. To match two point sets, one of them is translated, rotated, and scaled until an optimal match is found, which is a computationally intensive process. In this paper, a robust line-feature-based approach for model-based recognition is presented that achieves good matching even in a noisy environment or in the presence of occlusion. We propose three different ways of forming the line segments of a point set. Two of the methods are based on a fixed reference point: the centroid and the center of the minimal enclosing disk of the point set. The third method is based on the longest line segments that can be formed within the point set. The line features extracted from these line segments are insensitive to noise and can be used to determine the rotation and scale of the image point set accurately and reliably; more importantly, they allow a 2D-2D matching algorithm to be adopted. The first 2D matching computes the relative scale and orientation between the two point sets. After the image point set has been rotated and scaled accordingly, the M-estimation Hausdorff distance is applied in the second 2D matching to measure the relative translation between the query point set and the model point set. This 2D-2D matching greatly reduces the required memory and computation compared with a 4D matching. Both the performance and the noise sensitivity of our algorithms are evaluated using simulated data. Experiments show that our 2D-2D algorithms yield a high level of performance when determining the scale, the orientation, and the similarity of two point sets.
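To make the underlying distance concrete, the following is a minimal sketch of the classical (symmetric) Hausdorff distance between two 2D point sets. Note this is the standard definition only; it does not reproduce the paper's M-estimation variant or the line-feature matching pipeline, and the point sets used below are illustrative, not taken from the paper's experiments.

```python
import math

def directed_hausdorff(A, B):
    """h(A, B): for each point in A, find its nearest neighbour in B,
    then take the largest of those nearest-neighbour distances."""
    return max(min(math.dist(a, b) for b in B) for a in A)

def hausdorff(A, B):
    """Symmetric Hausdorff distance H(A, B) = max(h(A, B), h(B, A))."""
    return max(directed_hausdorff(A, B), directed_hausdorff(B, A))

# Toy example: the worst-matched point of B lies 1 unit from its
# nearest neighbour in A, so H(A, B) = 1.0.
A = [(0.0, 0.0), (1.0, 0.0)]
B = [(0.0, 0.0), (2.0, 0.0)]
print(hausdorff(A, B))  # → 1.0
```

Because the plain maximum is dominated by the single worst outlier, robust variants such as the M-estimation form used in the paper replace it with a statistic that downweights outlying points.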
