Abstract
Since visual assessment of wheat diseases (e.g., leaf rust and tan spot) is subjective and inefficient, this study focused on developing an automatic, objective, and efficient diagnosis approach. For each plant, color and color-infrared (CIR) images were collected as a pair. An automatic image-processing approach was developed to crop the paired images to the same region, after which a semiautomatic webtool was developed to expedite dataset creation: the webtool generated the dataset from either image of a pair and automatically built the corresponding dataset from the other image. Each image was manually assigned to one of three groups: control (disease-free), light disease, and severe disease. After image segmentation, handcrafted features (HFs) were extracted from each image type, and the diagnosis results demonstrated that parallel feature fusion achieved higher accuracy than features from either image type alone. The performance of deep features (DFs) extracted by different deep learning (DL) models (e.g., AlexNet, VGG16, ResNet101, GoogLeNet, and Xception) on wheat disease detection was compared, and the DFs extracted by ResNet101 yielded the highest accuracy, perhaps because its deeper layers extract finer features. In addition, parallel fusion of DFs produced higher accuracy than DFs from a single-source image. DFs outperformed HFs in wheat disease detection, and DFs coupled with parallel feature fusion achieved diagnosis accuracies of 75, 84, and 71% for leaf rust, tan spot, and leaf rust + tan spot, respectively. The methodology, developed directly for greenhouse applications by plant pathologists, breeders, and other users, can be extended to field applications with future tests on field data and model fine-tuning.
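As a rough illustration only (not the paper's implementation), the fusion step can be sketched in a few lines of Python. The array names, dimensions, and random values below are hypothetical; the complex-vector combination follows one common definition of parallel feature fusion, with plain concatenation (serial fusion) shown for contrast:

```python
import numpy as np

# Hypothetical deep-feature matrices, one row per leaf image.
# Shapes and random values are illustrative only.
rng = np.random.default_rng(0)
color_feats = rng.random((10, 2048))  # features extracted from color images
cir_feats = rng.random((10, 2048))    # features extracted from CIR images

# Parallel fusion (one common definition): combine the two real-valued
# feature vectors into a single complex-valued vector, a + i*b, so the
# fused dimensionality stays the same as each input's.
parallel_fused = color_feats + 1j * cir_feats   # shape (10, 2048)

# Serial fusion, shown for contrast: concatenate end to end,
# doubling the feature dimensionality.
serial_fused = np.concatenate([color_feats, cir_feats], axis=1)  # shape (10, 4096)
```

Either fused representation would then be passed to a classifier to predict the control / light-disease / severe-disease label.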
Highlights
Wheat (Triticum aestivum L.) is one of the world’s most productive and important crops, which plays a crucial role in food security (Curtis and Halford, 2014; Shewry and Hey, 2015).
With an overall goal of developing and implementing an automated solution to assess greenhouse wheat diseases, this study proposes an innovative methodology of using deep features and their parallel fusion from color and CIR images.
This assumption is supported by the results for tan spot disease, where the performance of CIR and color images did not differ significantly (Figure 6A): the tan spot symptoms were larger and more obvious, and were therefore identified with better accuracy.
Summary
Wheat (Triticum aestivum L.) is one of the world’s most productive and important crops, which plays a crucial role in food security (Curtis and Halford, 2014; Shewry and Hey, 2015). The two main approaches to managing wheat diseases are breeding disease-resistant varieties and applying chemical treatments (Kolmer, 1996; Ransom and McMullen, 2008; Figlan et al., 2020). For both approaches, researchers conduct extensive greenhouse work before transferring the most promising materials or treatments to the field for further evaluation. The current approach to wheat disease diagnosis relies on visual observations by well-trained graders. This approach potentially suffers from subjectivity (grader bias), inefficiency (slow observation speed), inter-grader variation (inconsistent results among different graders), and fatigue (tiresome operation) (Lehmann et al., 2015). An automated, efficient, and objective approach to accurately and quickly diagnosing wheat diseases is therefore needed (Luvisi et al., 2016).