Abstract

Sperm detection performance is particularly critical for sperm motility tracking. However, semen images contain many non-sperm objects, occluded sperm, and poorly detailed texture features, all of which directly affect detection accuracy. To reduce false and missed detections, a multi-sperm target detection model with an SA attention mechanism, YOLOv5s-SA, is proposed based on the YOLOv5s algorithm. Firstly, a depthwise separable convolution structure replaces part of the convolutions in the backbone network, maintaining stable precision while reducing the number of model parameters. Secondly, a new multi-scale feature fusion module is designed to enhance the perception of feature information, supplementing the positional information and high-resolution detail that the deep feature maps lack. Finally, the SA attention mechanism is integrated into the neck network before the feature map output to strengthen the correlation between feature map channels and improve the fine-grained feature fusion ability of YOLOv5s. Experimental results show that, compared with various YOLO algorithms, the proposed algorithm improves both detection accuracy and speed. Compared with the YOLOv3, YOLOv3-spp, YOLOv5s, and YOLOv5m models, the average accuracy increases by 18.1%, 15.2%, 6.9%, and 1.9%, respectively. The proposed model effectively reduces missed detections of occluded sperm and achieves lightweight, efficient multi-sperm target detection.
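To illustrate the first modification, the sketch below shows a generic depthwise separable convolution block in PyTorch. It is a minimal illustration of the standard technique, not the authors' exact implementation: the class name, channel sizes, and the choice of SiLU activation (YOLOv5's default) are assumptions for demonstration.

```python
import torch
import torch.nn as nn


class DepthwiseSeparableConv(nn.Module):
    """Replaces a standard KxK convolution with a depthwise KxK
    convolution (one filter per input channel, groups=in_channels)
    followed by a pointwise 1x1 convolution that mixes channels.
    Hypothetical sketch; not the paper's exact module."""

    def __init__(self, in_ch, out_ch, k=3, stride=1):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, k, stride,
                                   padding=k // 2, groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.SiLU()  # YOLOv5 uses SiLU activations

    def forward(self, x):
        return self.act(self.bn(self.pointwise(self.depthwise(x))))


# Parameter savings for a 3x3 conv mapping 64 -> 128 channels:
#   standard:   3*3*64*128          = 73,728 weights
#   separable:  3*3*64 + 1*1*64*128 =  8,768 weights (~8.4x fewer)
x = torch.randn(1, 64, 80, 80)
print(DepthwiseSeparableConv(64, 128)(x).shape)  # torch.Size([1, 128, 80, 80])
```

The parameter comparison in the comments shows why this substitution makes the backbone lighter: the depthwise step applies one spatial filter per channel, and the cheap 1x1 pointwise step handles cross-channel mixing.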
