Citrus is an economically important crop, cultivated over vast areas under complex terrain and environmental conditions. Its long growth cycle makes it susceptible to a wide variety of diseases and pests, and if these cannot be identified accurately and in time so that appropriate control measures can be taken, citrus yield and quality suffer severely. This study aims to improve the detection accuracy of leaf diseases and pests, reduce the computational scale of the model, and enhance its deployability. A lightweight disease and pest detection model based on an improved YOLOv8 is proposed, and a disease and pest dataset covering different environmental conditions is established. First, the convolutional module (Conv) in the neck network of YOLOv8 is replaced with GSConv and the C2f module with VoV-GSCSP, forming a Slim-neck architecture that reduces the computational complexity of the model while maintaining high recognition accuracy. At the same time, the C2f module in the backbone network is replaced with a C2f_EMA module that integrates the EMA efficient multi-scale attention mechanism, enhancing the model's ability to extract features of leaf diseases and pests in complex environments. In addition, the original detection head is improved through multi-level channel compression, which reduces features along the channel dimension. The SEDS-YOLOv8 model is designed through the above methods. Experimental results show that the model's parameters, computational cost, and memory usage are reduced by 63.5%, 72.83%, and 61.9%, respectively, while its precision, recall, and mean average precision reach 97.5%, 96.2%, and 98.5%, respectively. On mobile devices, the detection frame rate reaches 358.5 frames per second, and the average inference time for a single leaf disease and pest image is 4.4 ms. These results demonstrate that the algorithm significantly reduces the computational load of the network while maintaining high detection performance, meeting the deployment requirements of mobile and embedded devices.

ACKNOWLEDGEMENTS

Thanks for the data support provided by the National-level Innovation Program Project Fund "Research on Seedling Inspection Robot Technology Based on Multi-source Information Fusion and Deep Network" (No. 202410451009); the Jiangsu Provincial Natural Science Research General Project (No. 20KJB530008); the China Society for Smart Engineering project "Research on Intelligent Internet of Things Devices and Control Program Algorithms Based on Multi-source Data Analysis" (No. ZHGC104432); the China Engineering Management Association project "Comprehensive Application Research on Intelligent Robots and Intelligent Equipment Based on Big Data and Deep Learning" (No. GMZY2174); the Key Project of the National Science and Information Technology Department Research Center National Science and Technology Development Research Plan (No. KXJS71057); and the Key Project of the National Science and Technology Support Program of the Ministry of Agriculture (No. NYF251050).
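As a concrete illustration of the Slim-neck modification described in the abstract, the sketch below shows how a GSConv block is commonly implemented in PyTorch: a standard convolution producing half the output channels, a depthwise convolution on that half, channel concatenation, and a channel shuffle. This is a minimal sketch based on the published GSConv design, not the authors' code; the ConvBNAct helper, the kernel sizes, and the example tensor shapes are illustrative assumptions.

```python
# Minimal GSConv sketch (Slim-neck style); ConvBNAct and all hyperparameters
# here are illustrative assumptions, not the paper's implementation.
import torch
import torch.nn as nn


class ConvBNAct(nn.Module):
    """Convolution + BatchNorm + SiLU, the usual YOLO-style building block."""
    def __init__(self, c_in, c_out, k=1, s=1, g=1):
        super().__init__()
        self.conv = nn.Conv2d(c_in, c_out, k, s, k // 2, groups=g, bias=False)
        self.bn = nn.BatchNorm2d(c_out)
        self.act = nn.SiLU()

    def forward(self, x):
        return self.act(self.bn(self.conv(x)))


class GSConv(nn.Module):
    """GSConv: dense convolution for half the channels, depthwise convolution for the rest."""
    def __init__(self, c_in, c_out, k=1, s=1):
        super().__init__()
        c_half = c_out // 2
        self.dense = ConvBNAct(c_in, c_half, k, s)                    # standard convolution
        self.depthwise = ConvBNAct(c_half, c_half, 5, 1, g=c_half)    # depthwise convolution

    def forward(self, x):
        x1 = self.dense(x)
        x2 = torch.cat((x1, self.depthwise(x1)), dim=1)  # concatenate dense + depthwise halves
        # Channel shuffle so dense and depthwise features are interleaved.
        b, c, h, w = x2.shape
        return x2.view(b, 2, c // 2, h, w).transpose(1, 2).reshape(b, c, h, w)


if __name__ == "__main__":
    # Example: a neck feature map of shape (1, 256, 40, 40) mapped to 128 channels.
    feat = torch.randn(1, 256, 40, 40)
    print(GSConv(256, 128, k=3, s=1)(feat).shape)  # torch.Size([1, 128, 40, 40])
```

Compared with a standard convolution over all channels, only half of the output channels come from a dense convolution, which is the main source of the parameter and FLOP reduction that the Slim-neck design targets.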