Abstract

This paper reports on a system for an autonomous underwater vehicle to perform in situ, multi-session hull inspection using long-term simultaneous localization and mapping (SLAM). Our method assumes very little a priori knowledge, and it does not require the aid of acoustic beacons, which are a typical means of navigation in this type of application. Our system combines recent techniques in underwater saliency-informed visual SLAM with a method for representing the ship hull surface as a collection of many locally planar surface features. This methodology produces accurate maps that can be constructed in real time on consumer-grade computing hardware. A single-session SLAM result is initially used as a prior map for later sessions, where the robot automatically merges the multiple surveys into a common hull-relative reference frame. To perform the relocalization step, we use a particle filter that leverages the locally planar representation of the ship hull surface, together with a fast visual descriptor matching algorithm. Finally, we apply the recently developed graph sparsification tool, generic linear constraints, to manage the computational complexity of the SLAM system as the robot accumulates information across multiple sessions. We show results for 20 SLAM sessions for two large vessels over the course of days, months, and even up to three years, with a total path length of approximately 10.2 km.
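To make the relocalization idea above more concrete, the sketch below shows how a particle filter could score candidate hull-relative poses against a prior map of locally planar patches while also folding in a visual descriptor match term. This is only a minimal illustration under assumed names and models (PlanarPatch, descriptor_match_score, a downward-looking range measurement); it is not the paper's implementation.

```python
# Minimal sketch of a particle-filter relocalization step against a prior map of
# locally planar hull patches. All names and models here are illustrative
# assumptions, not the paper's implementation.
import numpy as np

rng = np.random.default_rng(0)

class PlanarPatch:
    """A locally planar surface feature: a point on the hull and its unit normal."""
    def __init__(self, center, normal):
        self.center = np.asarray(center, dtype=float)
        self.normal = np.asarray(normal, dtype=float) / np.linalg.norm(normal)

def point_to_plane_distance(point, patch):
    """Signed distance from a 3-D point to the patch's local plane."""
    return float(np.dot(point - patch.center, patch.normal))

def descriptor_match_score(pose, prior_map):
    """Placeholder for a fast visual descriptor match against the prior map.
    Returns a likelihood in (0, 1]; a real system would compare image features."""
    return 0.5  # assumed constant so the sketch stays self-contained

def relocalize(prior_patches, range_measurement, particles, sigma=0.05):
    """One measurement update + resampling step of the particle filter.
    particles: (N, 3) array of candidate hull-relative positions."""
    weights = np.empty(len(particles))
    for i, p in enumerate(particles):
        # Predict the hull point the range sensor would hit from this pose
        # (here, simply the particle position offset by the range along -z).
        predicted_hit = p + np.array([0.0, 0.0, -range_measurement])
        # Score against the nearest locally planar patch in the prior map.
        nearest = min(prior_patches,
                      key=lambda q: np.linalg.norm(q.center - predicted_hit))
        d = point_to_plane_distance(predicted_hit, nearest)
        geometric = np.exp(-0.5 * (d / sigma) ** 2)
        weights[i] = geometric * descriptor_match_score(p, prior_patches)
    weights /= weights.sum()
    # Resampling keeps particles consistent with the prior hull map.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

# Toy usage: a flat hull at z = 0 sampled by nine patches, particles scattered
# around z = 1.0, and a 1.0 m range measurement straight down toward the hull.
patches = [PlanarPatch([x, y, 0.0], [0, 0, 1.0]) for x in range(3) for y in range(3)]
particles = rng.normal(loc=[1.0, 1.0, 1.0], scale=0.2, size=(200, 3))
particles = relocalize(patches, range_measurement=1.0, particles=particles)
```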
