Abstract

Wheat rust is one of the important factors leading to wheat yield decline. The traditional method of manual identification of wheat rust remains inefficient. With the development of unmanned technology, unmanned aerial vehicles (UAVs) equipped with advanced deep vision models can be used to monitor wheat growth, enabling timely disease detection and deployment of corresponding treatment measures. However, due to the limitations of embedded system hardware, deploying a high‐accuracy yet lightweight wheat rust detection model on an embedded system is a challenge. In this paper, a training method based on transfer learning and sharpness‐aware minimization (SAM) is proposed to improve the accuracy of lightweight models. Specifically, the initial model is pretrained on the ImageNet dataset, and then the classifier of the model is fine‐tuned on the wheat rust training set; to reduce the risk of overfitting, the SAM method is applied to adjust the global parameters of the model. The experimental results on the Yellow‐Rust‐19 dataset show that the training method can effectively improve the accuracy of the wheat rust detection model, which is 3.56% higher than that of existing methods. In addition, the parameter scales and computing resource requirements of four lightweight models are compared. The results indicate that MobileNetV3‐Small can achieve satisfactory accuracy with low storage and computing resource requirements, and a detection frame rate of 101.36 frames per second on a Raspberry Pi, making it suitable for the unmanned detection of wheat rust.
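The SAM procedure mentioned above seeks parameters in flat regions of the loss landscape by taking the gradient at an adversarially perturbed point rather than at the current weights. As a minimal illustration only (not the paper's implementation, which applies SAM to deep network parameters), the two-step SAM update can be sketched on a toy one-dimensional loss; the loss function, learning rate, and perturbation radius `rho` here are all illustrative assumptions:

```python
def loss(w):
    # Toy convex loss with its minimum at w = 2; stands in for the
    # training loss of the real model (illustrative only).
    return (w - 2.0) ** 2

def grad(w):
    # Analytic gradient of the toy loss.
    return 2.0 * (w - 2.0)

def sam_step(w, lr=0.1, rho=0.05):
    """One sharpness-aware minimization step:
    1) ascend to the worst-case neighbor w + e, where e points along
       the normalized gradient and has norm rho;
    2) descend from w using the gradient evaluated at w + e."""
    g = grad(w)
    if g == 0.0:
        return w                    # already at a stationary point
    e = rho * g / abs(g)            # ascent perturbation (1-D norm)
    g_adv = grad(w + e)             # gradient at the perturbed weights
    return w - lr * g_adv           # update the original weights

# Run a short optimization from a distant starting point.
w = 10.0
for _ in range(200):
    w = sam_step(w)
```

In the paper's setting, `grad` would be a minibatch gradient over the network parameters and the perturbation would be normalized by the full gradient norm; the sketch only shows the ascend-then-descend structure of the update.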
