Abstract
Fusion of vision and AHRS (attitude and heading reference system) sensors has become a prevalent strategy for legged-robot SLAM (Simultaneous Localization and Mapping) in recent years, owing to its low cost and its effectiveness for global positioning. In this paper, a new adaptive estimation algorithm is proposed that achieves robot SLAM by fusing binocular vision and AHRS sensors. A novel acceleration of SIFT based on the Compute Unified Device Architecture (CUDA) is presented to detect matching feature points in 2D images: each step of SIFT is assigned to either the CPU or the GPU according to its characteristics, so as to make full use of the available computational resources. Registration of the 3D feature point clouds is performed with the iterative closest point (ICP) algorithm. In our tests, the GPU-based SIFT implementation runs at 30 frames per second (fps) on most images at 900 × 750 resolution. Compared with other methods, the algorithm is simple to implement and well suited to parallel processing, so it can easily be integrated into mobile-robot tasks, such as navigation and object tracking, that require real-time localization. Experimental results show that in unknown indoor environments the proposed algorithm operates stably and achieves high positioning accuracy.
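As a minimal illustration of the ICP idea used for the registration step, the sketch below aligns two 2D point sets by alternating nearest-neighbor matching with a closed-form rigid-transform solve. This is a hedged sketch, not the paper's implementation: the paper registers 3D feature point clouds, whereas this example is 2D for brevity, and all function names here are invented.

```python
import math

def best_rigid_2d(src, dst):
    """Closed-form 2D rigid transform (rotation theta, translation tx, ty)
    minimizing squared error between corresponding point pairs."""
    n = len(src)
    csx = sum(p[0] for p in src) / n
    csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n
    cdy = sum(p[1] for p in dst) / n
    sxx = syy = sxy = syx = 0.0
    for (ax, ay), (bx, by) in zip(src, dst):
        # accumulate cross-covariance terms of the centered point sets
        ax -= csx; ay -= csy
        bx -= cdx; by -= cdy
        sxx += ax * bx; syy += ay * by
        sxy += ax * by; syx += ay * bx
    theta = math.atan2(sxy - syx, sxx + syy)
    c, s = math.cos(theta), math.sin(theta)
    # translation maps the rotated source centroid onto the target centroid
    tx = cdx - (c * csx - s * csy)
    ty = cdy - (s * csx + c * csy)
    return theta, tx, ty

def icp_2d(src, dst, iters=20):
    """Iterative closest point: match each source point to its nearest
    target point, solve for the rigid transform, apply it, and repeat."""
    cur = list(src)
    for _ in range(iters):
        matched = []
        for p in cur:
            matched.append(min(dst, key=lambda q: (q[0] - p[0]) ** 2
                                                  + (q[1] - p[1]) ** 2))
        theta, tx, ty = best_rigid_2d(cur, matched)
        c, s = math.cos(theta), math.sin(theta)
        cur = [(c * x - s * y + tx, s * x + c * y + ty) for x, y in cur]
    return cur
```

In practice, 3D ICP replaces the 2D closed-form solve with an SVD-based (Kabsch) step, and the nearest-neighbor search is accelerated with a k-d tree; the alternation structure is the same.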