Abstract

With the rapid development of marine exploration, underwater robots play an increasingly important role in military operations, fisheries, aquaculture, and resource protection. Underwater target detection is the main way for underwater robots to obtain information, so the development of underwater robots is inseparable from advances in underwater target detection technology. Existing target detection algorithms achieve good results on land but are often unsatisfactory in actual underwater target detection. In this paper, building on the principles of plant intelligence and the YOLOv5 target detection algorithm, we propose the PILLO algorithm, designed specifically for underwater target detection in environments with blurred imaging and insufficient light. Specifically, the Swin Transformer is improved based on the principles of the multicore plant intelligence model and replaces part of the backbone network of the PILLO algorithm. The Shuffle Attention mechanism is improved based on the principles of the plant logical space transfer mechanism and added to the neck network of the PILLO algorithm. The SIoU loss function is improved based on the principle of the plant ecological regulation module and applied to the PILLO algorithm. Experimental results show that the PILLO model achieves a mean average precision (mAP) of 87.1% on underwater target detection, outperforming common target detection models.
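The abstract does not give the paper's exact modified-SIoU formulation, but as a rough illustration of the loss family it names: SIoU-style losses augment the usual 1 − IoU bounding-box regression term with an angle-aware center-distance penalty. The sketch below is a simplified, assumed version (function names are mine, and SIoU's shape-cost term is omitted), not the paper's implementation:

```python
import math

def iou(box_a, box_b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def siou_style_loss(pred, target):
    """Simplified SIoU-flavoured box loss: 1 - IoU plus an angle-aware
    center-distance penalty (illustrative; omits SIoU's shape cost)."""
    # Offsets between the two box centers.
    cx_p, cy_p = (pred[0] + pred[2]) / 2, (pred[1] + pred[3]) / 2
    cx_t, cy_t = (target[0] + target[2]) / 2, (target[1] + target[3]) / 2
    dx, dy = abs(cx_t - cx_p), abs(cy_t - cy_p)
    sigma = math.hypot(dx, dy)
    # Angle cost: zero when the centers are axis-aligned (SIoU's key idea).
    sin_alpha = min(dx, dy) / sigma if sigma > 0 else 0.0
    angle_cost = 1 - 2 * math.sin(math.asin(sin_alpha) - math.pi / 4) ** 2
    # Distance cost, normalized by the smallest enclosing box and
    # modulated by the angle cost.
    cw = max(pred[2], target[2]) - min(pred[0], target[0])
    ch = max(pred[3], target[3]) - min(pred[1], target[1])
    gamma = 2 - angle_cost
    dist_cost = (1 - math.exp(-gamma * (dx / cw) ** 2)) + \
                (1 - math.exp(-gamma * (dy / ch) ** 2))
    return 1 - iou(pred, target) + dist_cost / 2
```

For a perfect prediction the loss is 0; as boxes drift apart both the IoU term and the distance penalty grow, which is the behavior the paper's regulation-module-inspired modification presumably tunes further.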
