Abstract

Autonomous underwater vehicles are applied in diverse fields, particularly in tasks that are risky for human beings to perform, such as optical inspection for the quality control of structures. Optical sensors are appealing for their lower cost and the larger quantity of data they supply. Lasers can be used together with cameras to reconstruct structures in three dimensions, creating a faithful representation of the environment. In this work, however, a purely visual approach was adopted: the paper presents a method that fuses the three-dimensional information harvested over time, also combining RGB information for surface reconstruction. The map construction follows the motion estimated by an odometry method previously selected from the literature. Experiments conducted in a real scenario show that the proposed solution is able to provide a reliable map of objects and even of the seafloor.
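The paper's code is not reproduced here; purely as an illustration of the kind of odometry-driven map accumulation the abstract describes, the sketch below transforms per-frame colored point clouds into a world frame using the poses estimated by odometry and concatenates them into a single colored map. The function name, the 4x4 camera-to-world pose convention, and the data layout are assumptions for this sketch, not the authors' implementation.

```python
# Hypothetical sketch (not the authors' code): accumulate per-frame colored
# point clouds into a world-frame map using poses estimated by odometry.
import numpy as np

def accumulate_colored_map(frames):
    """frames: iterable of (pose, points, colors) where
       pose   : (4, 4) camera-to-world transform estimated by odometry,
       points : (N, 3) 3D points expressed in the camera frame,
       colors : (N, 3) RGB values associated with each point."""
    map_points, map_colors = [], []
    for pose, points, colors in frames:
        R, t = pose[:3, :3], pose[:3, 3]
        world_points = points @ R.T + t      # rotate then translate into the world frame
        map_points.append(world_points)
        map_colors.append(colors)
    return np.vstack(map_points), np.vstack(map_colors)
```

In practice the accumulated cloud would typically be filtered or meshed afterwards for surface reconstruction; this sketch only covers the accumulation step.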

Highlights

  • Nowadays, the acquisition and analysis of ocean information are crucial steps for biological research and, simultaneously, for supporting different maritime industry applications

  • For applications such as the inspection of structures and the tracking of moving objects (Galceran and Carreras 2013; Srividya and Shobha 2014), the use of visual information has been investigated. It is particularly important in close-range inspection, when proximity to submerged structures is required (Kocak et al. 2008)

  • Image acquisition is a quite complex task due, for example, to the attenuation present in this environment and the shadows created by artificial illumination, which can induce false motion (Prados Gutiérrez 2013)


Summary

Introduction

The acquisition and analysis of ocean information are crucial steps for biological research and, simultaneously, for supporting different maritime industry applications. The mapping process requires correct motion estimation, for which visual odometry techniques can be applied to input images from one or multiple cameras (Scaramuzza and Fraundorfer 2011; Fraundorfer and Scaramuzza 2012).
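The introduction does not specify which odometry method was selected from the literature. Purely as an illustration of the kind of camera-based visual odometry pipeline cited above (Scaramuzza and Fraundorfer 2011), the sketch below estimates the relative camera motion between two consecutive frames with ORB feature matching and an essential-matrix decomposition in OpenCV. The function name and parameter choices are assumptions; the method actually used in the paper may differ.

```python
# Illustrative sketch of two-frame monocular visual odometry (not necessarily
# the method selected in the paper): ORB features + essential matrix.
import cv2
import numpy as np

def relative_motion(img_prev, img_curr, K):
    """Estimate rotation R and unit-scale translation t between two
    consecutive grayscale frames, given the camera intrinsic matrix K (3x3)."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(img_prev, None)
    kp2, des2 = orb.detectAndCompute(img_curr, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # Robustly estimate the essential matrix and recover R, t. In the monocular
    # case the translation scale is unknown and must come from another source.
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                   prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t
```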

