Abstract

Highlights
- An approach using deep learning was proposed for identifying diseased regions in UAS imagery of corn fields, with 97.23% testing accuracy using the VGG16 model.
- Disease types were identified within the diseased regions with a testing accuracy of 98.85% using the VGG16 model.
- On the diseased leaves, severity was estimated with a testing accuracy of 94.20% using the VGG16 model.
- Deep learning models have the potential to bring efficiency and accuracy to field scouting.

Abstract. Accurately locating diseased regions, identifying disease types, and estimating disease severity in corn fields are connected steps in developing an effective disease management system. Traditional disease management has relied on manual scouting, which is inefficient. The research community is therefore working on advanced disease management systems based on deep learning. However, most past studies trained deep learning models on public datasets of images with uniform backgrounds acquired under lab conditions, limiting the models' use under field conditions. In addition, few studies have addressed in-field corn disease analysis using Unmanned Aerial System (UAS) imagery. Therefore, UAS and handheld imaging sensors were used in this study to acquire corn disease images from fields located at Purdue University's Agronomy Center for Research and Education (ACRE) in the summer of 2020. A total of 55 UAS flights were conducted over three corn fields from June 20 through September 29, resulting in a collection of approximately 59,000 images. A novel three-stage approach was proposed in which a total of nine image classification models were trained independently using three neural network architectures, namely VGG16, ResNet50, and InceptionV3, for locating diseased regions, identifying disease types, and estimating disease severity under field conditions. Diseased regions were first identified in UAS-acquired corn field imagery using a sliding window combined with deep learning-based image classification, with testing accuracies of up to 97.23%. Within the diseased regions, three common corn diseases, namely Northern Leaf Blight (NLB), Gray Leaf Spot (GLS), and Northern Leaf Spot (NLS), were then identified with testing accuracies of up to 98.85%. Finally, the severity of NLS on leaves was estimated with a testing accuracy of up to 94.20%. The VGG16 model achieved the highest testing accuracies for all three stages: identifying diseased regions, identifying corn disease types, and estimating NLS severity. This study presents promising results for three main elements of a disease management system and could advance traditional scouting by integrating deep learning with UAS imagery.

Keywords: Corn Diseases, Datasets, Deep Learning, Disease Identification, Disease Region Location, Image Classification, Severity Estimation, UAS Imagery.
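
The first stage described above combines a sliding window with CNN-based image classification to localize diseased regions in UAS imagery. The snippet below is a minimal sketch of that idea, assuming a fine-tuned Keras/TensorFlow VGG16 binary classifier (diseased vs. healthy patch); the model file name, 224x224 window size, stride, and decision threshold are illustrative assumptions, not values reported by the authors.

```python
# Sketch: sliding-window diseased-region localization with a CNN classifier.
# File names, window size, stride, and threshold are hypothetical.
import numpy as np
import tensorflow as tf

# Hypothetical fine-tuned VGG16 binary classifier (diseased vs. healthy patch).
model = tf.keras.models.load_model("vgg16_diseased_region_classifier.h5")

def locate_diseased_regions(image, window=224, stride=112, threshold=0.5):
    """Slide a window over a UAS image and return patches classified as diseased."""
    boxes = []
    h, w, _ = image.shape
    for y in range(0, h - window + 1, stride):
        for x in range(0, w - window + 1, stride):
            patch = image[y:y + window, x:x + window, :].astype("float32")
            patch = tf.keras.applications.vgg16.preprocess_input(patch[np.newaxis, ...])
            prob = float(model.predict(patch, verbose=0)[0][0])
            if prob >= threshold:
                boxes.append((x, y, window, window, prob))
    return boxes

# Example usage on a single UAS frame (file name is a placeholder).
uas_image = np.array(tf.keras.utils.load_img("uas_frame.jpg"))
print(locate_diseased_regions(uas_image))
```

In a full pipeline, the patches flagged here would then be passed to the second-stage disease-type classifier (NLB, GLS, NLS) and, for NLS, to the severity estimation model.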
