Abstract

Side-scan sonar is a lightweight acoustic sensor that is frequently deployed on autonomous underwater vehicles (AUVs) to provide high-resolution seafloor images. However, using side-scan images to perform simultaneous localization and mapping (SLAM) remains challenging, because the images lack 3D bathymetric information and contain few discriminative features. To tackle this, the authors propose a feature-based SLAM framework using side-scan sonar, which is able to automatically detect and robustly match keypoints between paired side-scan images. The authors then use the detected correspondences as constraints to optimise the AUV pose trajectory. The proposed method is evaluated on real data collected by a Hugin AUV, using as ground-truth reference both manually annotated keypoints and a 3D bathymetry mesh from a multibeam echosounder (MBES). Experimental results demonstrate that this approach is able to reduce drift from the dead-reckoning system. The framework is made publicly available for the benefit of the community.
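As a rough illustration of the keypoint-correspondence step described above, the sketch below uses OpenCV's SIFT detector and a brute-force matcher as stand-ins for the paper's own detection and matching pipeline; the image file names, the ratio-test threshold, and the choice of SIFT are assumptions made purely for illustration and do not reflect the authors' implementation.

```python
# Minimal sketch: match keypoints between a pair of side-scan images.
# SIFT and the brute-force matcher are stand-ins for the paper's own
# detector/matcher; file names and thresholds are illustrative assumptions.
import cv2

# Hypothetical pre-processed (e.g. intensity-normalised) side-scan images
img_a = cv2.imread("sss_pass_a.png", cv2.IMREAD_GRAYSCALE)
img_b = cv2.imread("sss_pass_b.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp_a, des_a = sift.detectAndCompute(img_a, None)
kp_b, des_b = sift.detectAndCompute(img_b, None)

# Brute-force matching with Lowe's ratio test to keep confident matches only
matcher = cv2.BFMatcher(cv2.NORM_L2)
good = [m for m, n in matcher.knnMatch(des_a, des_b, k=2)
        if m.distance < 0.75 * n.distance]

# Pixel correspondences; in a SLAM framework of this kind, each pair would
# contribute one constraint between the two AUV poses being optimised.
correspondences = [(kp_a[m.queryIdx].pt, kp_b[m.trainIdx].pt) for m in good]
print(f"{len(correspondences)} candidate keypoint correspondences")
```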
