In the future, resources on land will be overexploited. The exploration of new resources in the ocean is in progress, and mining will take place on the seabed. The sea is also a major source of renewable energy: offshore wind farms and tidal power plants are being built. In addition, the majority of world trade is handled via sea routes and large harbors. All these maritime facilities age and suffer from effects such as corrosion and malfunctions, so they need to be inspected frequently. For deep-sea applications, and for reasons of safety and cost reduction, autonomous underwater vehicles (AUVs) will be the first choice.

The project “CView” addresses one of these inspection problems: harbor inspection. However, the algorithms we present in this article can be adapted to many other inspection tasks. One of the main goals of this project is to find cracks or damaged areas on underwater structures and to observe critical sections under water with cost-effective methods.

The platform for developing the inspection guidance algorithms is the AUV “SeaCat”. This underwater vehicle has a control software system with a user interface for mission planning, a mission control system, a precise navigation system, optimized motor control with an autopilot, and sensors for obstacle detection and inspection.

For obstacle and inspection target detection, a scanning sonar is used. The sonar images are automatically processed with edge detection and line extraction algorithms to obtain a simplified environment description, which is used by the guidance methods presented in this article. A pan–tilt enabled sensor head with a camera, a laser measurement unit and an MBES (Multi Beam Echo Sounder) is used to inspect the detected objects. Additionally, these sensors provide distance information to the inspection object, which can be used by the inspection guidance.

This article presents inspection methods that use online information from the vehicle sensors to guide the vehicle efficiently and safely. It is also important to handle the interaction between mission planning and execution. During mission planning, the operator defines the type of inspection object (wall, vessel, sluice, etc.). The algorithms we develop combine this mission planning information with the online data from the vehicle sensors to guide the vehicle toward optimal inspection results. This requires precise distance control to the inspection object, collision avoidance and object recognition.
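To illustrate the kind of sonar processing described above, the following is a minimal sketch of an edge detection and line extraction step, assuming a Canny/probabilistic-Hough pipeline implemented with OpenCV; the function name, thresholds and parameter values are illustrative assumptions and not taken from the CView system itself.

```python
import cv2
import numpy as np


def extract_environment_lines(sonar_image_path,
                              canny_low=50, canny_high=150,
                              hough_threshold=40,
                              min_line_length=30, max_line_gap=10):
    """Detect edges in a sonar scan and extract line segments as a
    simplified environment description (hypothetical parameter values)."""
    # Load the sonar scan as a grayscale intensity image.
    img = cv2.imread(sonar_image_path, cv2.IMREAD_GRAYSCALE)
    if img is None:
        raise FileNotFoundError(sonar_image_path)

    # Suppress speckle noise before edge detection.
    blurred = cv2.medianBlur(img, 5)

    # Edge detection (Canny) followed by probabilistic Hough line extraction.
    edges = cv2.Canny(blurred, canny_low, canny_high)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                            threshold=hough_threshold,
                            minLineLength=min_line_length,
                            maxLineGap=max_line_gap)

    # Return (x1, y1, x2, y2) segments in image coordinates;
    # an empty list means no structure was found in this scan.
    return [] if lines is None else [tuple(l[0]) for l in lines]
```

The extracted segments can then serve as the simplified environment description consumed by the guidance methods.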
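The distance control mentioned above could, for example, be realized as a simple feedback loop that keeps a fixed stand-off distance to the inspection object using the range measurements from the sensor head. The sketch below assumes a basic PID controller; class name, gains and the 3 m stand-off are purely illustrative and not the controller actually used on the SeaCat.

```python
class StandoffController:
    """Minimal PID controller turning a range measurement to the inspection
    object into a lateral thrust command that keeps a fixed stand-off
    distance (illustrative gains, not the CView implementation)."""

    def __init__(self, target_distance_m, kp=0.8, ki=0.05, kd=0.3):
        self.target = target_distance_m
        self.kp, self.ki, self.kd = kp, ki, kd
        self._integral = 0.0
        self._prev_error = None

    def update(self, measured_distance_m, dt):
        # Positive error means the vehicle is farther away than desired.
        error = measured_distance_m - self.target
        self._integral += error * dt
        derivative = 0.0 if self._prev_error is None else (error - self._prev_error) / dt
        self._prev_error = error
        # Command toward the object when too far, away from it when too close.
        return self.kp * error + self.ki * self._integral + self.kd * derivative


# Example: hold a 3 m stand-off using a range reading updated at 10 Hz.
controller = StandoffController(target_distance_m=3.0)
thrust_cmd = controller.update(measured_distance_m=3.6, dt=0.1)
```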