Abstract

Due to the harsh and largely unexplored marine environment and the limited diving capabilities of humans, underwater robots play an important role in ocean exploration and development. However, their performance is limited by blurred images, low contrast, and color deviation, which result from the complex underwater imaging environment. Existing mainstream object detection networks perform poorly when applied directly to underwater tasks. Although cascaded detector networks can achieve high accuracy, their inference speed is too slow for practical use. To address these problems, this paper proposes a lightweight and accurate one-stage underwater object detection network, called U-ATSS. First, we compress the backbone of ATSS to significantly reduce the number of network parameters and improve inference speed without losing detection accuracy, making the underwater object detection network lightweight and real-time. Then, we propose a plug-and-play receptive field module, F-ASPP, which obtains larger receptive fields and richer spatial information, and we optimize the learning rate schedule as well as the classification loss function to significantly improve detection accuracy and convergence speed. We evaluated U-ATSS and compared it with other methods on the Kesci Underwater Object Detection Algorithm Competition dataset, which contains a variety of marine organisms. The experimental results show that U-ATSS is not only clearly lightweight but also delivers excellent, competitive detection accuracy.
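The abstract does not describe the internals of F-ASPP, so the following is only a rough illustration of the kind of ASPP-style receptive field module the text alludes to: parallel dilated convolutions whose outputs are fused to enlarge the receptive field. The branch count, dilation rates, and channel widths below are assumptions for the sketch, not values taken from the paper.

```python
import torch
import torch.nn as nn


class ASPPLikeBlock(nn.Module):
    """Illustrative ASPP-style receptive field block (not the paper's F-ASPP).

    Parallel dilated 3x3 convolutions capture context at several receptive
    field sizes; their outputs are concatenated and fused by a 1x1 conv.
    Dilation rates (1, 3, 5) and channel widths are assumptions.
    """

    def __init__(self, in_channels: int, out_channels: int, dilations=(1, 3, 5)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(in_channels, out_channels, kernel_size=3,
                          padding=d, dilation=d, bias=False),
                nn.BatchNorm2d(out_channels),
                nn.ReLU(inplace=True),
            )
            for d in dilations
        ])
        # 1x1 projection fuses the concatenated multi-scale features.
        self.project = nn.Conv2d(out_channels * len(dilations), out_channels,
                                 kernel_size=1, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = [branch(x) for branch in self.branches]
        return self.project(torch.cat(feats, dim=1))


if __name__ == "__main__":
    # Quick shape check on a dummy FPN-level feature map.
    block = ASPPLikeBlock(in_channels=256, out_channels=256)
    y = block(torch.randn(1, 256, 64, 64))
    print(y.shape)  # torch.Size([1, 256, 64, 64])
```

Because such a block preserves the spatial resolution and channel count of its input, it can be dropped between a backbone feature map and the detection head, which is consistent with the "plug-and-play" description in the abstract.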
