Abstract

Marine biological resources are abundant, and their rational development, study, and protection are of great significance to marine ecological health and economic development. Quantitative underwater object detection currently plays an important role in marine biological research, marine species richness surveys, and rare species conservation. However, heavy noise in the underwater environment, small object scales, dense biological distributions, and occlusion all increase the difficulty of detection. In this paper, a detection algorithm, MAD-YOLO (Multiscale Feature Extraction and Attention Feature Fusion Reinforced YOLO for Marine Benthos Detection), based on an improved YOLOv5, is proposed to solve the above problems. To improve the network's adaptability to the underwater environment, VOVDarkNet is designed as the feature extraction backbone; it uses intermediate features with different receptive fields to reinforce feature extraction. AFC-PAN is proposed as the feature fusion network so that the network can learn correct feature and location information for objects at various scales, improving its ability to perceive small objects. The SimOTA label assignment strategy and a decoupled head are introduced to help the model better handle occlusion and dense distributions. Experiments show that MAD-YOLO raises mAP0.5:0.95 on the URPC2020 dataset from 49.8% to 53.4% compared with the original YOLOv5. In addition, the advantages of the model are visualized and analyzed through controlled-variable experiments, which show that MAD-YOLO is well suited to detecting blurred, dense, and small-scale objects. The model performs well in marine benthos detection tasks and can effectively support marine life science research and marine engineering applications.
The source code is publicly available at https://github.com/JoeNan1/MAD-YOLO.
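The backbone's stated idea, aggregating intermediate features that have different receptive fields, follows the receptive-field arithmetic of stacked convolutions. The sketch below illustrates that arithmetic only; the layer count, kernel size, and stride are illustrative assumptions, not the paper's actual VOVDarkNet configuration.

```python
# Receptive-field arithmetic for a plain stack of stride-1 3x3 convolutions,
# illustrating why concatenating intermediate layer outputs (as in
# VoVNet-style one-shot aggregation) mixes features with different
# receptive fields. Layer count and kernel size are assumptions for
# illustration, not taken from the paper.

def receptive_fields(num_layers: int, kernel: int = 3, stride: int = 1) -> list[int]:
    """Receptive field of each layer's output in a conv stack.

    Standard recurrence: r_i = r_{i-1} + (kernel - 1) * j_{i-1},
    where the jump j (cumulative stride) satisfies j_i = j_{i-1} * stride
    and stays 1 for stride-1 stacks.
    """
    rf, jump = 1, 1
    fields = []
    for _ in range(num_layers):
        rf += (kernel - 1) * jump
        jump *= stride  # remains 1 for stride-1 convolutions
        fields.append(rf)
    return fields

# Five stride-1 3x3 convs: each successive layer sees a 2-pixel-wider context,
# so concatenating all five outputs combines 3x3 up to 11x11 contexts.
print(receptive_fields(5))  # -> [3, 5, 7, 9, 11]
```

Concatenating these intermediate outputs along the channel axis is what lets a single aggregated feature map carry both fine local detail (small receptive field) and broader context (large receptive field), which the abstract credits for the improved small-object perception.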
