Abstract

With the rapid development of underwater vision sensors, simultaneous localization and mapping (SLAM) based on stereo vision has become an active research topic in ocean investigation and exploration. In this paper, a visual SLAM system built around a stereo camera is presented to estimate the motion of an autonomous underwater vehicle (AUV) and build a feature map of the surrounding environment in real time. Feature detection and matching are implemented with the Speeded Up Robust Features (SURF) algorithm. After mismatches are eliminated, the local 3-D coordinates of the stereo-matched SURF features are computed from their disparity values and the stereo camera's calibration parameters. Visual SLAM is then performed by fusing the feature coordinates and the AUV pose with an Extended Kalman Filter (EKF). The system has been verified on raw data gathered by the AUV underwater.
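To make the back-projection step concrete, the sketch below is not taken from the paper; it assumes a generic calibrated, rectified pinhole stereo pair and shows how a matched feature's left-image pixel coordinates and disparity can be converted into local 3-D coordinates. The focal length f, baseline b, and principal point (cx, cy) are illustrative stand-ins for the stereo camera's calibration parameters, and the resulting point would serve as the landmark measurement fed to the EKF.

```python
import numpy as np

def backproject(u, v, disparity, f, b, cx, cy):
    """Convert a rectified stereo match (pixel (u, v) in the left image,
    disparity in pixels) into 3-D coordinates in the left-camera frame.

    Assumes a calibrated, rectified pinhole stereo pair with focal length f
    (pixels), baseline b (metres), and principal point (cx, cy); these values
    are hypothetical calibration results, not the paper's.
    """
    z = f * b / disparity          # depth along the optical axis
    x = (u - cx) * z / f           # lateral offset from the optical centre
    y = (v - cy) * z / f           # vertical offset from the optical centre
    return np.array([x, y, z])

# Example: a stereo-matched feature at pixel (412, 305) with a 23-pixel
# disparity, using illustrative calibration values.
p = backproject(412.0, 305.0, 23.0, f=700.0, b=0.12, cx=320.0, cy=240.0)
print(p)  # local 3-D position of the landmark in metres
```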
