Abstract

Autonomous Underwater Vehicle (AUV) navigation and control modules often integrate as much sensory data as possible to increase the accuracy of pose and velocity estimates. Visual odometry can be a good complement for robot localization when the vehicle navigates challenging underwater scenarios, such as those colonized by seagrass or algae. Thanks to the wide variety of cameras available on the market, their increased performance, and their moderate cost, this type of sensor can now be used in marine robots. The work proposed in this paper increases the robustness of contemporary feature-based visual odometry methods in such environments by evolving a state-of-the-art approach, ensuring the tracking of visual keypoints that are geometrically and photometrically invariant, highly distinguishable, and exhibit strong repeatability. Experimental results, obtained from visual datasets captured by a stereoscopic camera installed on an AUV while performing different missions in large areas of the Balearic Islands of special ecological interest, show the improvement of this proposal over other available approaches and its feasibility for online use.
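To make the keypoint-selection idea concrete, the sketch below shows a generic feature-based visual odometry front end that keeps only distinctive, geometrically consistent correspondences between consecutive frames. It is an illustration under assumptions, not the authors' method: the ORB detector, Lowe's ratio test, the 0.75 ratio threshold, and the RANSAC essential-matrix check are all placeholder choices, and the identity camera matrix must be replaced by calibrated intrinsics.

```python
# Illustrative sketch only: a generic feature-based VO front end with
# keypoint filtering. Detector, thresholds, and intrinsics are assumptions,
# not the filtering criteria proposed in the paper.
import cv2
import numpy as np


def filter_matches(img_prev, img_curr, ratio=0.75):
    """Detect ORB keypoints in two frames and keep distinctive, consistent matches."""
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(img_prev, None)
    kp2, des2 = orb.detectAndCompute(img_curr, None)
    if des1 is None or des2 is None:
        return [], [], []

    # Lowe's ratio test: keep matches clearly better than their second-best
    # candidate, i.e. highly distinguishable keypoints.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    knn = matcher.knnMatch(des1, des2, k=2)
    good = [m[0] for m in knn
            if len(m) == 2 and m[0].distance < ratio * m[1].distance]

    # Geometric consistency: reject remaining outliers with RANSAC on the
    # essential matrix (requires calibrated intrinsics K; identity is a stub).
    if len(good) >= 5:
        pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
        pts2 = np.float32([kp2[m.trainIdx].pt for m in good])
        K = np.eye(3)  # placeholder: use the stereo rig's calibrated camera matrix
        _, mask = cv2.findEssentialMat(pts1, pts2, K, cv2.RANSAC, 0.999, 1.0)
        if mask is not None:
            good = [m for m, keep in zip(good, mask.ravel()) if keep]
    return kp1, kp2, good
```

The surviving matches would then feed a standard pose-estimation step (e.g. stereo triangulation plus motion estimation); how the paper enforces photometric invariance and repeatability beyond this generic filtering is not reproduced here.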
