Existing detection models for stored-grain pests show low accuracy and poor generalization in fine-grained detection tasks characterized by numerous species, small inter-class differences, and large intra-class variations. This study addresses these issues by constructing a dataset of 3227 images covering 12 species of grain pests and proposing an enhanced YOLOv7 algorithm for grain pest detection. The algorithm integrates the Convolutional Block Attention Module (CBAM) into the backbone network to strengthen the perception of crucial features and improve detection accuracy. In addition, the neck network incorporates ACmix, a module that mixes convolution and self-attention, which improves efficiency and target sensitivity and reduces the false-negative rate. The Efficient Complete Intersection over Union (ECIoU) is adopted as the bounding-box loss function to enhance the model's generalization ability. Experimental results show that the proposed algorithm achieves an mAP@0.5 of 91.9% and an F1-score of 89.6% on the 12 grain pest classes, improvements of 6.2 and 3.4 percentage points over the original YOLOv7, while the detection speed remains nearly unchanged. Moreover, the overall performance of the algorithm surpasses that of Faster R-CNN, SSD, other YOLO-series models, and Deformable DETR. The method enables high-precision, real-time detection of small targets against complex backgrounds, offering a viable approach for developing intelligent grain pest detection systems.
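For readers unfamiliar with the attention mechanism named above, the following is a minimal PyTorch sketch of a standard CBAM block (channel attention followed by spatial attention). The channel count, reduction ratio, and kernel size are illustrative assumptions, and the sketch does not reproduce the paper's exact placement of CBAM within the YOLOv7 backbone.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Channel attention: global avg/max pooling -> shared MLP -> sigmoid gate."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1, bias=False),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        avg = self.mlp(torch.mean(x, dim=(2, 3), keepdim=True))
        mx = self.mlp(torch.amax(x, dim=(2, 3), keepdim=True))
        return torch.sigmoid(avg + mx)

class SpatialAttention(nn.Module):
    """Spatial attention: channel-wise avg/max maps -> 7x7 conv -> sigmoid gate."""
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        avg = torch.mean(x, dim=1, keepdim=True)
        mx, _ = torch.max(x, dim=1, keepdim=True)
        return torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))

class CBAM(nn.Module):
    """CBAM: refine a feature map with channel attention, then spatial attention."""
    def __init__(self, channels: int, reduction: int = 16, kernel_size: int = 7):
        super().__init__()
        self.ca = ChannelAttention(channels, reduction)
        self.sa = SpatialAttention(kernel_size)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x * self.ca(x)
        return x * self.sa(x)

# Example: refine a hypothetical backbone feature map of shape (batch, 256, 80, 80).
feat = torch.randn(1, 256, 80, 80)
refined = CBAM(256)(feat)
print(refined.shape)  # torch.Size([1, 256, 80, 80])
```

Because the block preserves the input shape, it can in principle be dropped between existing backbone stages without altering downstream layer dimensions, which is consistent with the integration strategy described in the abstract.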