Abstract

This paper reports on a method for an autonomous underwater vehicle to perform real-time visual simultaneous localization and mapping (SLAM) on large ship hulls over multiple sessions. Using a monocular camera, our method employs a piecewise-planar model to explicitly optimize the ship hull surface in our factor-graph framework, and anchor nodes to co-register multiple surveys. To enable real-time performance for long-term SLAM, we use the recent Generic Linear Constraints (GLC) framework to sparsify our factor graph. This paper analyzes how our single-session SLAM techniques can be used in the GLC framework, and describes a particle filter reacquisition algorithm so that an underwater session can be automatically re-localized to a previously built SLAM graph. We provide real-world experimental results involving automated ship hull inspection, and show that our localization filter outperforms Fast Appearance-Based Mapping (FAB-MAP), a popular place-recognition system. Using our approach, we can automatically align surveys that were taken days, months, and even years apart.
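To illustrate the flavor of the reacquisition step described above, the following is a minimal sketch of particle-filter re-localization against a previously built map: particles are spread over the prior survey area, weighted by a visual place-recognition score for each incoming camera frame, and resampled until they concentrate on the vehicle's location in the old map. This is not the paper's algorithm; the map extent, motion model, and the `place_score` function below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 500                                  # number of particles
true_pose = np.array([32.0, 6.0])        # vehicle position in the prior map (simulation only)

# Initialize particles uniformly over an assumed 50 m x 10 m prior survey area.
particles = rng.uniform(low=[0.0, 0.0], high=[50.0, 10.0], size=(N, 2))
weights = np.full(N, 1.0 / N)

def place_score(p, z):
    """Illustrative place-recognition likelihood: high when particle p lies
    near the location z that the current camera frame matched in the prior map."""
    return np.exp(-0.5 * np.sum((p - z) ** 2) / 2.0 ** 2)

for step in range(20):
    # Propagate particles with the (simulated) odometry increment plus noise.
    motion = np.array([0.5, 0.0])
    particles += motion + rng.normal(scale=0.2, size=particles.shape)
    true_pose = true_pose + motion

    # Simulated measurement: the place-recognition system reports a noisy
    # match location in the prior map for the current camera frame.
    z = true_pose + rng.normal(scale=1.0, size=2)

    # Reweight particles by the place-recognition likelihood and normalize.
    weights *= np.array([place_score(p, z) for p in particles])
    weights += 1e-300
    weights /= weights.sum()

    # Systematic resampling when the effective sample size collapses.
    if 1.0 / np.sum(weights ** 2) < N / 2:
        idx = np.searchsorted(np.cumsum(weights), (rng.random() + np.arange(N)) / N)
        particles = particles[idx]
        weights = np.full(N, 1.0 / N)

estimate = np.average(particles, axis=0, weights=weights)
print("re-localized position estimate:", estimate, "true:", true_pose)
```

Once the particle cloud has converged, the estimated pose can seed registration of the new session against the stored SLAM graph.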
