Abstract

The conventional method for crop insect detection, based on visual judgment in the field, is time-consuming, laborious, subjective, and error-prone. Early detection and accurate localization of agricultural insect pests can significantly improve the effectiveness of pest control and reduce costs, which has become an urgent demand for crop production. Spodoptera frugiperda is a migratory agricultural pest that has severely decreased the yield of maize, rice, and other crops worldwide. To monitor the occurrence of S. frugiperda in maize in a timely manner, an end-to-end detection model termed the Pest Region-CNN (Pest R-CNN) was proposed based on the Faster Region-CNN (Faster R-CNN) model. Pest R-CNN detects infestation from the feeding traces left on maize leaves by S. frugiperda. The proposed model was trained and validated using high-spatial-resolution red–green–blue (RGB) ortho-images acquired by an unmanned aerial vehicle (UAV). On the basis of feeding severity, S. frugiperda infestation was classified into four classes: juvenile, minor, moderate, and severe. The severity and specific feeding location of S. frugiperda infestation can be determined and depicted as detection boxes using the proposed model. A mean average precision (mAP) of 43.6% was achieved by the proposed model on the test dataset, showing the great potential of deep-learning object detection in pest monitoring. Compared with the Faster R-CNN and YOLOv5 models, the detection accuracy of the proposed model increased by 12% and 19%, respectively. Further ablation studies showed the effectiveness of channel and spatial attention, group convolution, deformable convolution, and the multi-scale aggregation strategy in improving detection accuracy. The design of the object detection architecture could serve as a reference for related research.
This is the first step in applying deep-learning object detection to S. frugiperda feeding traces, enabling high-spatial-resolution RGB images obtained by UAVs to be used for detecting S. frugiperda infestation. The proposed model will be beneficial for monitoring S. frugiperda pest stress and for realizing precision pest control.
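The abstract credits channel and spatial attention among the components that improved detection accuracy but does not spell out their form. As an illustration only, the following NumPy sketch shows the general idea of CBAM-style channel-then-spatial gating on a feature map, omitting the learned MLP and convolution layers of a full attention module; the shapes and pooling choices are assumptions, not the paper's actual design.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(feat):
    # feat: (C, H, W). Squeeze spatial dims, then gate each channel.
    pooled = feat.mean(axis=(1, 2))      # (C,) global average pool
    weights = sigmoid(pooled)            # per-channel gate in (0, 1)
    return feat * weights[:, None, None]

def spatial_attention(feat):
    # Pool across channels, then gate each spatial location.
    pooled = feat.mean(axis=0)           # (H, W) channel-wise average
    weights = sigmoid(pooled)            # per-location gate in (0, 1)
    return feat * weights[None, :, :]

feat = np.random.rand(8, 4, 4).astype(np.float32)
out = spatial_attention(channel_attention(feat))
print(out.shape)  # (8, 4, 4)
```

In a trained network the pooled statistics would pass through learned layers before the sigmoid, so that the gates are data-adaptive rather than fixed functions of the input.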

Highlights

  • Spodoptera frugiperda is a polyphagous pest species that spreads rapidly due to its wide adaptability and strong reproductive capacity and seriously affects the yield of major food crops such as maize and rice [1]

  • The mean average precision was used to evaluate the detection accuracy of the proposed model; it is the mean value of the AP values of all classes

  • A novel Pest R-convolutional neural network (CNN) based on the Faster R-CNN was proposed for maize S. frugiperda feeding-trace object detection, combining a feature pyramid network (FPN), attention mechanism, deformable convolution, and multi-scale strategy
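The mAP computation described in the highlights, the mean of the per-class average precision (AP) values, can be sketched as follows; the per-class AP values here are illustrative placeholders, not the paper's actual results.

```python
# mAP as the mean of the per-class AP values over the four
# infestation-severity classes. AP values are made up for illustration.
ap_per_class = {
    "juvenile": 0.55,
    "minor": 0.48,
    "moderate": 0.40,
    "severe": 0.31,
}
mAP = sum(ap_per_class.values()) / len(ap_per_class)
print(f"mAP = {mAP:.3f}")  # prints "mAP = 0.435"
```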



Introduction

Spodoptera frugiperda is a polyphagous pest species that spreads rapidly due to its wide adaptability and strong reproductive capacity and seriously affects the yield of major food crops such as maize and rice [1]. Manual field sampling is generally used to assess plant diseases and insect pests: agricultural workers or experts with sufficient agronomic knowledge and experience can provide accurate diagnoses and pest-control advice by observing images of symptomatic crops. However, it is difficult to quantify the overall distribution and severity of pest and disease damage in this way [3]. This can lead to problems such as over-spraying, which decreases crop quality and increases production costs [4,5].


