Abstract
To autonomously explore complex underwater environments, it is convenient to develop motion planning strategies that do not depend on prior information. In this publication, we present a robotic exploration algorithm for autonomous underwater vehicles (AUVs) that guides the robot to explore an unknown two-dimensional (2D) environment. The algorithm is built upon view planning (VP) and frontier-based (FB) strategies. Traditional robotic exploration algorithms seek full coverage of the scene with data from only one sensor; if coverage is required for multiple sensors, multiple exploration missions are needed. Our approach is designed to achieve full coverage of the environment with data from two sensors in a single exploration mission: occupancy data from the profiling sonar, from which the shape of the environment is perceived, and optical data from the camera, which captures the details of the environment. This saves time and mission costs. The algorithm is computationally efficient, so that it can run online on the AUV's onboard computer. In our approach, the environment is represented using a labeled quadtree occupancy map which, at the same time, is used to generate the viewpoints that guide the exploration. We have tested the algorithm in different environments through numerous experiments, including sea operations using the Sparus II AUV and its sensor suite.
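As an informal illustration of how a labeled quadtree occupancy map can both represent the environment and expose the frontier between free and unknown space, the sketch below builds a minimal 2D quadtree with UNKNOWN/FREE/OCCUPIED labels and extracts free leaves that border unknown cells. All class and function names are illustrative assumptions made for this example; this is not the paper's implementation.

```python
# Hedged sketch, not the paper's code: a labeled quadtree occupancy map with a
# simple frontier query (FREE leaves adjacent to UNKNOWN space).
from __future__ import annotations
from dataclasses import dataclass
from enum import Enum
from typing import List, Optional

class Label(Enum):
    UNKNOWN = 0
    FREE = 1
    OCCUPIED = 2

@dataclass
class QuadNode:
    x: float                                   # lower-left corner
    y: float
    size: float                                # side length of the square cell
    label: Label = Label.UNKNOWN
    children: Optional[List["QuadNode"]] = None

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px < self.x + self.size and self.y <= py < self.y + self.size

    def insert(self, px: float, py: float, label: Label, min_size: float) -> None:
        """Label the leaf containing (px, py), subdividing down to min_size."""
        if not self.contains(px, py):
            return
        if self.size <= min_size:
            self.label = label
            return
        if self.children is None:              # split into four quadrants
            h = self.size / 2.0
            self.children = [QuadNode(self.x,     self.y,     h, self.label),
                             QuadNode(self.x + h, self.y,     h, self.label),
                             QuadNode(self.x,     self.y + h, h, self.label),
                             QuadNode(self.x + h, self.y + h, h, self.label)]
        for child in self.children:
            child.insert(px, py, label, min_size)

    def label_at(self, px: float, py: float) -> Label:
        """Return the label of the leaf containing (px, py)."""
        if self.children is not None:
            for child in self.children:
                if child.contains(px, py):
                    return child.label_at(px, py)
        return self.label

    def leaves(self) -> List["QuadNode"]:
        if self.children is None:
            return [self]
        return [leaf for child in self.children for leaf in child.leaves()]

def frontier_cells(root: QuadNode) -> List[QuadNode]:
    """FREE leaves bordering UNKNOWN space: candidate regions for new viewpoints."""
    frontiers = []
    for leaf in root.leaves():
        if leaf.label is not Label.FREE:
            continue
        cx, cy, s = leaf.x + leaf.size / 2, leaf.y + leaf.size / 2, leaf.size
        neighbours = [(cx + s, cy), (cx - s, cy), (cx, cy + s), (cx, cy - s)]
        if any(root.contains(nx, ny) and root.label_at(nx, ny) is Label.UNKNOWN
               for nx, ny in neighbours):
            frontiers.append(leaf)
    return frontiers

# Toy example: a 16 m x 16 m map at 1 m resolution with a few observations.
root = QuadNode(0.0, 0.0, 16.0)
for px in range(2, 8):
    root.insert(px + 0.5, 4.5, Label.FREE, min_size=1.0)     # traversed free cells
root.insert(8.5, 4.5, Label.OCCUPIED, min_size=1.0)          # a sonar return
print(len(frontier_cells(root)), "frontier cells")
```

In a full system the same tree would be updated from sonar scans, and the frontier cells would seed the viewpoint generation described in the abstract.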
Highlights
Autonomous underwater vehicles (AUVs) have become a fundamental tool to perform many underwater tasks, such as close inspection of structures [1], near-bottom surveys [2], or intervention [3]. The use of AUVs has many advantages over alternative technologies such as remotely operated vehicles (ROVs).
Related view planning (VP) work includes Vidal et al. [8] and Vidal et al. [9]; Williams et al. [14] proposed a target reinspection method for AUVs equipped with a synthetic aperture sonar (SAS).
In this work, we have presented a 2D frontier-based viewpoint generation algorithm for exploration using AUVs.
Summary
Autonomous underwater vehicles (AUVs) have become a fundamental tool to perform many underwater tasks, such as close inspection of structures [1], near-bottom surveys [2], or intervention [3]. We present an algorithm that is capable of guiding an underwater robot to obtain a map of a region of interest. Frontier-based methods guide the exploration by focusing on the regions between known and unknown space; this idea was first proposed by Yamauchi [4]. Robotic exploration algorithms based on VP usually follow the next-best-view (NBV) approach, where the best viewpoint is planned online according to the current map and robot location. Our algorithm is capable of autonomously guiding an underwater robot to obtain both the occupancy map and the optical data of a region of interest in a single exploration mission.
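To make the next-best-view idea concrete, the sketch below scores candidate viewpoints against the current map-derived gain and robot position, trading expected new coverage against travel cost. The linear score, its weight, and the names used here are assumptions for illustration only, not the selection criterion used in the paper.

```python
# Hedged sketch of an NBV-style viewpoint selection: the scoring model is an
# assumption, chosen only to illustrate the gain-versus-travel-cost trade-off.
import math
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Viewpoint:
    x: float
    y: float
    expected_new_cells: int   # e.g. unknown quadtree cells visible from here

def next_best_view(candidates: List[Viewpoint],
                   robot_xy: Tuple[float, float],
                   travel_weight: float = 0.5) -> Viewpoint:
    """Pick the viewpoint maximising (expected gain - weighted travel cost)."""
    def score(v: Viewpoint) -> float:
        travel = math.hypot(v.x - robot_xy[0], v.y - robot_xy[1])
        return v.expected_new_cells - travel_weight * travel
    return max(candidates, key=score)

# Example: three frontier-derived candidates, robot at the origin.
candidates = [Viewpoint(2.0, 1.0, 12), Viewpoint(9.0, 0.0, 20), Viewpoint(1.0, 8.0, 6)]
best = next_best_view(candidates, (0.0, 0.0))
print(f"next best view: ({best.x}, {best.y})")
```

In the approach summarized above, such viewpoints would be generated from the labeled quadtree map itself and replanned online as new sonar and camera data arrive.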