Abstract

Classic 2D SLAM is no longer sufficient for today's complex environments. This report uses a virtual machine running Ubuntu with a slam_bot package, built on the RTAB-Map algorithm and visual SLAM mapping, to simulate a four-wheeled robot autonomously navigating to a target point in various environments. The report also introduces an RGB-D SLAM based algorithm, which combines visual and depth data to process the information collected from the sensors. The robot carries several sensors, including a lidar, an RGB vision camera, and odometry sensors. The aim is to evaluate whether the RTAB-Map algorithm with an RGB-D sensor can replace 2D SLAM. The results show that the robot using RGB-D data and the RTAB-Map algorithm performs well: the navigation system completes navigation and localization in many complex situations. However, some problems remain; in particular, the robot's speed is low, which may limit the application of self-navigating robots to a certain extent, for example in emergency scenarios.
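To illustrate the fusion of visual and depth data mentioned above, the following is a minimal sketch (not the report's actual code) of how an RGB-D frame can be back-projected into a coloured 3D point cloud using the standard pinhole camera model; the intrinsics and the tiny 2x2 images here are hypothetical values chosen for demonstration.

```python
def rgbd_to_pointcloud(depth, rgb, fx, fy, cx, cy):
    """Fuse a depth map and an RGB image into (x, y, z, r, g, b) points.

    fx, fy are the focal lengths and (cx, cy) the principal point of a
    pinhole camera; depth[v][u] is the metric depth at pixel (u, v).
    """
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z <= 0:                  # no depth reading at this pixel
                continue
            x = (u - cx) * z / fx       # pinhole back-projection
            y = (v - cy) * z / fy
            points.append((x, y, z) + rgb[v][u])
    return points


# Hypothetical 2x2 frame: depth in metres, colours as (r, g, b) tuples.
depth = [[1.0, 2.0],
         [0.0, 4.0]]                    # 0.0 marks a missing measurement
rgb = [[(255, 0, 0), (0, 255, 0)],
       [(0, 0, 255), (255, 255, 0)]]

cloud = rgbd_to_pointcloud(depth, rgb, fx=1.0, fy=1.0, cx=0.5, cy=0.5)
print(len(cloud))  # 3 valid points (the zero-depth pixel is skipped)
```

In a full RGB-D SLAM pipeline such as RTAB-Map, clouds like this from successive frames are registered against each other and stored in a graph for loop-closure detection, rather than processed frame by frame as shown here.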
