Abstract

Autonomous underwater vehicles have become essential tools for the collection of high-resolution bathymetric and geophysical data. An inertial navigation system, aided by a Doppler velocity log and a surface-mounted ultrashort baseline acoustic positioning system, is generally capable of providing sufficient accuracy at conventional survey scales. However, the accuracy decreases with depth, and the resulting relative positioning across different transects may not be sufficient to resolve features of interest at fine scales. This work presents a method for accurate coregistration of the positions of adjacent transects. The approach is based on detecting and matching local features in overlapping multibeam echosounder swaths. The navigational errors for the transects are taken to be described by latent Gaussian processes, observed through these matches. The hyperparameters of the Gaussian process are learned from the data themselves and require neither tuning of filter parameters nor intimate knowledge of the autonomous system or its sensor configuration. The proposed method is made robust to outliers through a non-Gaussian observation model. The approach is demonstrated on a data set collected at the Arctic Mid-Ocean Ridge (AMOR). The method can be used to construct high-resolution bathymetric models from repeated passes over the same area or to accurately coregister other sensors such as cameras, subbottom profilers, and magnetometers. The primary contribution of this work is the application of feature-based matching of bathymetry with a robust Gaussian process.
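To illustrate the core idea, the sketch below models a transect's along-track navigational error as a latent Gaussian process, observed through noisy offset estimates at feature-match locations, with the hyperparameters selected from the data by maximizing the log marginal likelihood. This is a minimal, hypothetical illustration only: it uses a Gaussian observation model and a coarse grid search, whereas the paper employs a robust non-Gaussian observation model, and all positions, offsets, and grid values here are invented for demonstration.

```python
import numpy as np


def rbf(x1, x2, length_scale, variance):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)


def log_marginal_likelihood(x, y, length_scale, noise_var):
    """Gaussian-process log marginal likelihood for given hyperparameters."""
    K = rbf(x, x, length_scale, 1.0) + noise_var * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))
            - 0.5 * len(x) * np.log(2.0 * np.pi))


def fit_and_predict(x, y, x_star):
    """Learn hyperparameters from the data (coarse grid search for
    simplicity) and return the posterior mean correction at x_star."""
    best = None
    for length_scale in (5.0, 10.0, 20.0, 40.0):   # candidate scales (m)
        for noise_var in (0.01, 0.05, 0.1):        # candidate noise levels
            lml = log_marginal_likelihood(x, y, length_scale, noise_var)
            if best is None or lml > best[0]:
                best = (lml, length_scale, noise_var)
    _, length_scale, noise_var = best
    K = rbf(x, x, length_scale, 1.0) + noise_var * np.eye(len(x))
    K_star = rbf(x_star, x, length_scale, 1.0)
    mean = K_star @ np.linalg.solve(K, y)
    return mean, length_scale, noise_var


# Hypothetical along-track positions (m) where feature matches between
# overlapping swaths yield cross-track offset observations (m).
x_obs = np.array([0.0, 15.0, 30.0, 45.0, 60.0, 75.0])
y_obs = np.array([0.2, 0.35, 0.5, 0.55, 0.7, 0.8])

x_query = np.linspace(0.0, 75.0, 6)
mean, length_scale, noise_var = fit_and_predict(x_obs, y_obs, x_query)
```

Because the hyperparameters fall out of the marginal likelihood, no hand-tuned filter gains or vehicle-specific sensor models are needed, which is the property the abstract emphasizes.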
