Abstract

Because underwater exploration is essential to the development and utilization of deep-sea resources, autonomous underwater operation is increasingly important as a way to avoid exposing humans to the dangerous high-pressure deep-sea environment. For autonomous underwater operation, intelligent computer vision is the most important technology. Because underwater images suffer from weak illumination and low quality, image enhancement is a necessary preprocessing step for underwater vision. In this paper, a combination of the max-RGB method and the shades-of-gray method is applied to enhance underwater images, and a CNN (Convolutional Neural Network) method is then proposed to address the weak-illumination problem by learning the mapping relationship that yields the illumination map. After this preprocessing, a deep CNN method is proposed to perform underwater detection and classification; according to the characteristics of underwater vision, two improved schemes are applied to modify the deep CNN structure. In the first scheme, a 1×1 convolution kernel is applied to the 26×26 feature map, and a downsampling layer is then added to resize the output to 13×13. In the second scheme, a downsampling layer is added first, followed by a convolution layer, and the result is combined with the last output to perform detection. Through comparison with Fast R-CNN, Faster R-CNN, and the original YOLO V3, scheme 2 is verified to be better at detecting underwater objects, with a detection speed of about 50 FPS (Frames per Second) and an mAP (mean Average Precision) of about 90%. The program is deployed on an underwater robot; the real-time detection results show that detection and classification are accurate and fast enough to assist the robot in carrying out underwater operations.
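
The abstract names two classical color-constancy estimators, max-RGB and shades of gray, as the basis of the enhancement step. The sketch below shows both estimators applied as a diagonal white-balance correction; the way the two estimates are combined (simple averaging here) and the Minkowski norm p=6 are assumptions for illustration, since the abstract does not give the paper's exact combination.

```python
# Hedged sketch of the max-RGB and shades-of-gray illuminant estimators,
# applied as a von Kries (diagonal) white-balance correction.
# The averaged combination of the two estimates is an assumption.
import numpy as np

def max_rgb_illuminant(img):
    """Max-RGB: estimate the illuminant from the per-channel maximum."""
    return img.reshape(-1, 3).max(axis=0)

def shades_of_gray_illuminant(img, p=6):
    """Shades of gray: Minkowski p-norm mean of each channel (p=6 is a common choice)."""
    return np.power(np.mean(np.power(img.reshape(-1, 3), p), axis=0), 1.0 / p)

def white_balance(img, p=6):
    """Correct img (float32 RGB in [0, 1], shape HxWx3) with an averaged illuminant estimate."""
    e = 0.5 * (max_rgb_illuminant(img) + shades_of_gray_illuminant(img, p))
    e = e / (np.linalg.norm(e) + 1e-8)           # unit-length illuminant vector
    gain = (1.0 / np.sqrt(3.0)) / (e + 1e-8)     # per-channel gain toward neutral gray
    return np.clip(img * gain, 0.0, 1.0)
```

The abstract also outlines scheme 2 for the modified YOLO V3 head: downsample the 26×26 feature map first, pass it through a convolution layer, and combine the result with the last 13×13 output. The PyTorch sketch below illustrates that data flow only; the channel counts, the use of max pooling for downsampling, and concatenation as the fusion operation are assumptions, not the paper's verified configuration.

```python
# Rough sketch of scheme 2: downsample 26x26 features, convolve, then fuse
# with the 13x13 output. Layer sizes and the concatenation fusion are assumed.
import torch
import torch.nn as nn

class Scheme2Head(nn.Module):
    def __init__(self, c26=256, c13=512, num_outputs=255):
        super().__init__()
        self.down = nn.MaxPool2d(kernel_size=2, stride=2)              # 26x26 -> 13x13
        self.conv = nn.Conv2d(c26, c26, kernel_size=3, padding=1)      # conv after downsampling
        self.fuse = nn.Conv2d(c26 + c13, num_outputs, kernel_size=1)   # detection output

    def forward(self, feat26, feat13):
        x = self.conv(self.down(feat26))       # (B, c26, 13, 13)
        x = torch.cat([x, feat13], dim=1)      # combine with the last 13x13 output
        return self.fuse(x)                    # (B, num_outputs, 13, 13)

# Example with dummy feature maps of the sizes mentioned in the abstract:
head = Scheme2Head()
out = head(torch.randn(1, 256, 26, 26), torch.randn(1, 512, 13, 13))
print(out.shape)  # torch.Size([1, 255, 13, 13])
```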
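Fusing the downsampled 26×26 features with the final 13×13 map is what lets the detector keep finer spatial detail for small underwater objects while still predicting on the coarse grid; this rationale is consistent with the abstract's claim that scheme 2 outperforms the original YOLO V3.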
