Abstract
Existing detection models for stored-grain pests suffer from low accuracy and poor generalization in fine-grained detection tasks involving many species, small inter-class differences, and large intra-class variations. This study addresses these issues by constructing a dataset of 3227 images covering 12 species of grain pests and proposing an enhanced YOLOv7 algorithm for grain pest detection. The algorithm integrates the Convolutional Block Attention Module (CBAM) into the backbone network to strengthen the perception of crucial features, thereby improving detection accuracy. In the neck network, it incorporates ACmix, a module that mixes convolution and self-attention, which improves efficiency and target sensitivity and reduces the false-negative rate. The Efficient Complete Intersection over Union (ECIoU) is adopted as the bounding-box loss function to improve the model’s generalization capability. Experimental results show that the proposed algorithm achieves an mAP@0.5 of 91.9% and an F1-score of 89.6% on the 12 grain pest species, improvements of 6.2 and 3.4 percentage points over the original algorithm, respectively, while the detection speed remains nearly unchanged. Moreover, the overall performance of the algorithm surpasses that of Faster R-CNN, SSD, other YOLO-series models, and Deformable DETR. The method enables high-precision, real-time detection of small targets against complex backgrounds, offering a viable approach for developing intelligent grain pest detection systems.
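To illustrate the attention mechanism named in the abstract, the following is a minimal PyTorch sketch of a standard CBAM block (sequential channel and spatial attention). It is not the authors’ exact implementation; the reduction ratio, kernel size, and insertion points within the YOLOv7 backbone are assumptions for illustration only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelAttention(nn.Module):
    """Channel attention: global avg/max pooling -> shared MLP -> sigmoid gate."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1, bias=False),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        avg = self.mlp(F.adaptive_avg_pool2d(x, 1))
        mx = self.mlp(F.adaptive_max_pool2d(x, 1))
        return torch.sigmoid(avg + mx)


class SpatialAttention(nn.Module):
    """Spatial attention: channel-wise avg/max maps -> 7x7 conv -> sigmoid gate."""

    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        avg = x.mean(dim=1, keepdim=True)
        mx, _ = x.max(dim=1, keepdim=True)
        return torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))


class CBAM(nn.Module):
    """CBAM: refine a backbone feature map with channel then spatial attention."""

    def __init__(self, channels: int, reduction: int = 16, kernel_size: int = 7):
        super().__init__()
        self.channel_attention = ChannelAttention(channels, reduction)
        self.spatial_attention = SpatialAttention(kernel_size)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x * self.channel_attention(x)
        return x * self.spatial_attention(x)


if __name__ == "__main__":
    # Example: refine a hypothetical 256-channel backbone feature map.
    feat = torch.randn(1, 256, 40, 40)
    print(CBAM(256)(feat).shape)  # torch.Size([1, 256, 40, 40])
```

Because the block is shape-preserving, it can in principle be inserted after convolutional stages of the backbone without altering downstream layer dimensions, which is consistent with the drop-in usage the abstract describes.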