Abstract

The effective detection of safflower in the field is crucial for implementing automated visual navigation and harvesting systems. Owing to the small physical size of safflower clusters, their dense spatial distribution, and the complexity of field scenes, current target detection technologies face several challenges in safflower detection, such as insufficient accuracy and high computational demands. Therefore, this paper introduces an improved safflower target detection model based on YOLOv5, termed Safflower-YOLO (SF-YOLO). The model employs Ghost_conv to replace traditional convolution blocks in the backbone network, significantly enhancing computational efficiency. Furthermore, the CBAM attention mechanism is integrated into the backbone network, and a combined $L_{CIOU+NWD}$ loss function is introduced to allow for more precise feature extraction, enhanced adaptive fusion capabilities, and accelerated loss convergence. Anchor boxes updated through K-means clustering replace the original anchors, enabling the model to better adapt to the multi-scale information of safflowers in the field. Data augmentation techniques such as Gaussian blur, noise addition, sharpening, and channel shuffling are applied to the dataset to maintain robustness against variations in lighting, noise, and viewing angle. Experimental results demonstrate that SF-YOLO surpasses the original YOLOv5s model: GFLOPs are reduced from 15.8 to 13.2 and the parameter count from 7.013 M to 5.34 M, decreases of 16.6% and 23.9%, respectively, while mAP0.5 increases by 1.3% to reach 95.3%. This work improves the accuracy of safflower detection in complex agricultural environments and provides a reference for subsequent autonomous visual navigation and automated non-destructive harvesting technologies in safflower operations.
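
For orientation, a minimal sketch of how such a combined regression loss is commonly assembled is given below; the abstract does not specify the weighting used in SF-YOLO, so the balancing coefficient $\alpha$ is an illustrative assumption.

$$
L_{CIOU+NWD} = (1-\alpha)\, L_{CIOU} + \alpha\, L_{NWD}, \qquad \alpha \in [0, 1],
$$

where $L_{CIOU}$ is the Complete-IoU bounding-box regression loss and $L_{NWD} = 1 - \exp\!\left(-\sqrt{W_2^2(\mathcal{N}_p, \mathcal{N}_g)}\,/\,C\right)$ penalizes the Normalized Gaussian Wasserstein Distance between the predicted and ground-truth boxes modeled as 2-D Gaussians $\mathcal{N}_p$ and $\mathcal{N}_g$, with $C$ a dataset-dependent normalization constant. The usual motivation for blending an NWD term with CIoU is that it is less sensitive to small positional offsets on tiny objects such as safflower clusters.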
