Abstract

This work describes a framework for sensor fusion of navigation data with camera-based 5 DOF relative pose measurements for 6 DOF vehicle motion in an unstructured 3D underwater environment. The fundamental goal of this work is to concurrently estimate, online, the current vehicle position and its past trajectory. This goal is framed within the context of improving mobile robot navigation to support sub-sea science and exploration. The vehicle trajectory is represented by a history of poses in an augmented state Kalman filter. Camera-derived spatial constraints from overlapping imagery provide partial observations of these poses and are used to enforce consistency and to provide a mechanism for loop closure. The multi-sensor camera + navigation framework is shown to have compelling advantages over a camera-only approach by: 1) improving the robustness of pairwise image registration, 2) setting the free gauge scale, and 3) allowing for an unconnected camera graph topology. Results are shown for a real-world data set collected by an autonomous underwater vehicle in an unstructured undersea environment.
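
To make the augmented-state mechanism concrete, the following is a minimal Python/NumPy sketch, not the authors' implementation: past poses are cloned into the state vector at image-capture times, and a camera-derived relative-pose measurement between two stored poses acts as a partial observation that enforces consistency and closes loops. The planar 3-parameter pose, the linear additive motion model, the class name DelayedStateEKF, and all noise values are illustrative assumptions that stand in for the paper's 6 DOF vehicle state and 5 DOF (scale-free) camera constraint.

    import numpy as np

    POSE_DIM = 3  # toy planar pose [x, y, heading]; the paper's vehicle state is 6 DOF

    class DelayedStateEKF:
        """Sketch of a delayed-state (augmented state) Kalman filter."""

        def __init__(self, x0, P0):
            self.x = np.asarray(x0, dtype=float)   # stacked current + past poses
            self.P = np.asarray(P0, dtype=float)   # joint covariance over all poses

        def augment(self):
            """Clone the current pose onto the end of the state (keyframe at image time)."""
            n = self.x.size
            J = np.vstack([np.eye(n), np.eye(POSE_DIM, n)])  # copies the leading (current) pose block
            self.x = J @ self.x
            self.P = J @ self.P @ J.T

        def predict(self, u, Q):
            """Dead-reckon only the current pose; stored past poses are static."""
            # Toy additive motion model; a real filter linearizes a nonlinear 6 DOF model.
            self.x[:POSE_DIM] += u
            self.P[:POSE_DIM, :POSE_DIM] += Q

        def update_relative_pose(self, i, j, z, R):
            """Camera constraint: measured offset between stored pose blocks i and j."""
            n = self.x.size
            H = np.zeros((POSE_DIM, n))
            H[:, i * POSE_DIM:(i + 1) * POSE_DIM] = -np.eye(POSE_DIM)
            H[:, j * POSE_DIM:(j + 1) * POSE_DIM] = np.eye(POSE_DIM)
            y = z - H @ self.x                     # innovation (linear stand-in for pose compounding)
            S = H @ self.P @ H.T + R
            K = self.P @ H.T @ np.linalg.inv(S)
            self.x = self.x + K @ y
            self.P = (np.eye(n) - K @ H) @ self.P

    # Illustrative usage: two keyframes, then a camera registration between them.
    ekf = DelayedStateEKF(np.zeros(POSE_DIM), 0.01 * np.eye(POSE_DIM))
    ekf.augment()                                          # keyframe stored as pose block 1
    ekf.predict(np.array([1.0, 0.0, 0.0]), 0.05 * np.eye(POSE_DIM))
    ekf.augment()                                          # keyframe stored as pose block 2
    ekf.update_relative_pose(1, 2, np.array([1.0, 0.0, 0.0]), 0.02 * np.eye(POSE_DIM))

Because the measurement Jacobian couples only the two pose blocks involved, a single image registration between temporally distant keyframes (a loop closure) propagates through the correlated joint covariance and pulls the whole stored trajectory toward consistency.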
