Abstract

To improve the applicability and robustness of the three-dimensional tracking method of an augmented reality-aided assembly guiding system for mechanical products, a tracking method based on the combination of point clouds and visual features is proposed. First, the tracking benchmark coordinate system is defined using a reference model point cloud to determine the position of the virtual assembly guiding information. Then, a camera tracking algorithm combining visual feature matching and point cloud alignment is implemented. To obtain enough visual feature matches in a textureless assembly environment, a novel ORB feature-matching strategy based on the consistency of direction vectors is presented. The experimental results show that the proposed method achieves good robustness and tracking accuracy in an assembly environment that lacks both visual and depth features, while also running in real time. Its overall performance is better than that of the point cloud-based KinectFusion tracking method.
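
The direction-vector consistency idea behind the ORB matching strategy can be illustrated with a short sketch. The snippet below is a minimal, hypothetical interpretation built on OpenCV's standard ORB detector and brute-force Hamming matcher: tentative matches are filtered by how closely their image-space displacement directions agree with the dominant direction found by a coarse histogram vote. The function name, the histogram-vote step, and the angle tolerance are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np
import cv2


def match_orb_direction_consistent(img1, img2, angle_tol_deg=15.0):
    """Illustrative ORB matching with a direction-vector consistency filter.

    Hypothetical sketch: keep only matches whose image-space displacement
    vectors agree with the dominant match direction between the two frames.
    """
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)
    if des1 is None or des2 is None:
        return []

    # Tentative matches from a brute-force Hamming matcher with cross-check.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    if not matches:
        return []

    # Displacement (direction) vector of each tentative match.
    vecs = np.array(
        [np.subtract(kp2[m.trainIdx].pt, kp1[m.queryIdx].pt) for m in matches]
    )
    angles = np.degrees(np.arctan2(vecs[:, 1], vecs[:, 0]))

    # Dominant direction via a coarse histogram vote over match angles.
    hist, edges = np.histogram(angles, bins=36, range=(-180.0, 180.0))
    peak = np.argmax(hist)
    dominant = 0.5 * (edges[peak] + edges[peak + 1])

    # Keep matches whose direction deviates little from the dominant one.
    diff = np.abs((angles - dominant + 180.0) % 360.0 - 180.0)
    return [m for m, d in zip(matches, diff) if d <= angle_tol_deg]
```

Filtering on a shared displacement direction is one plausible way to reject outlier matches in textureless scenes, where low-contrast descriptors otherwise produce many false correspondences; the surviving matches would then feed the combined visual/point-cloud camera tracking stage described in the abstract.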
