Abstract
Colour-thresholding digital imaging methods are generally accurate for measuring the percentage of foliar area affected by disease or pests (severity), but they perform poorly when scene illumination and background are not uniform. In this study, six convolutional neural network (CNN) architectures were trained for semantic segmentation in images of individual leaves exhibiting necrotic lesions and/or yellowing caused by the insect pest coffee leaf miner (CLM) and two fungal diseases: soybean rust (SBR) and wheat tan spot (WTS). All images were manually annotated for three classes: leaf background (B), healthy leaf (H), and injured leaf (I). Precision, recall, and Intersection over Union (IoU) metrics on the test image set were highest for the B class, followed by H and I, regardless of architecture. When the pixel-level predictions were used to calculate percent severity, Feature Pyramid Network (FPN), Unet, and DeepLabv3+ (Xception) performed best among the architectures: concordance coefficients exceeded 0.95, 0.96, and 0.98 for the CLM, SBR, and WTS datasets, respectively, when comparing predictions with the annotated severity. The other three architectures tended to misclassify healthy pixels as injured, leading to overestimation of severity. Results highlight the value of a CNN-based automatic segmentation method for determining severity in images of foliar diseases acquired under challenging brightness and background conditions. The accuracy of the severity estimated by FPN, Unet, and DeepLabv3+ (Xception) was similar to that obtained with standard commercial software, which requires adjustment of segmentation parameters and removal of the complex image background, tasks that slow down the process.
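To make the severity and IoU computations concrete, the sketch below shows how percent severity and per-class IoU can be derived from per-pixel label masks. The integer class encoding (0 = B, 1 = H, 2 = I) and the function names are illustrative assumptions, not the authors' implementation; severity is taken here as the injured fraction of leaf area, with background pixels excluded.

```python
import numpy as np

# Assumed label encoding for the three annotated classes:
# 0 = leaf background (B), 1 = healthy leaf (H), 2 = injured leaf (I)
BACKGROUND, HEALTHY, INJURED = 0, 1, 2

def percent_severity(mask: np.ndarray) -> float:
    """Percent of leaf area labelled injured; background pixels excluded."""
    healthy = np.count_nonzero(mask == HEALTHY)
    injured = np.count_nonzero(mask == INJURED)
    leaf = healthy + injured
    return 100.0 * injured / leaf if leaf else 0.0

def iou_per_class(pred: np.ndarray, truth: np.ndarray, n_classes: int = 3):
    """IoU between predicted and annotated masks, computed per class."""
    ious = []
    for c in range(n_classes):
        inter = np.count_nonzero((pred == c) & (truth == c))
        union = np.count_nonzero((pred == c) | (truth == c))
        ious.append(inter / union if union else float("nan"))
    return ious
```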
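The agreement statistic reported above is a concordance coefficient; a common choice in severity-assessment studies is Lin's concordance correlation coefficient (CCC), sketched below for a set of predicted and annotated severities. Whether the authors used exactly this estimator is an assumption.

```python
import numpy as np

def lin_ccc(pred: np.ndarray, annotated: np.ndarray) -> float:
    """Lin's concordance correlation coefficient between predicted
    and annotated severity values (one value per image)."""
    mp, ma = pred.mean(), annotated.mean()
    cov = ((pred - mp) * (annotated - ma)).mean()
    return 2.0 * cov / (pred.var() + annotated.var() + (mp - ma) ** 2)

# Hypothetical usage: a CCC near 1 indicates near-perfect agreement.
# sev_pred = np.array([percent_severity(m) for m in predicted_masks])
# sev_true = np.array([percent_severity(m) for m in annotated_masks])
# print(lin_ccc(sev_pred, sev_true))
```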