Abstract
Plant diseases and pests are a major challenge in the agriculture sector. Accurate and fast detection of diseases and pests in plants could support early treatment while substantially reducing economic losses. Recent developments in deep neural networks have allowed researchers to drastically improve the accuracy of object detection and recognition systems. In this paper, we present a deep-learning-based approach to detect diseases and pests in tomato plants using images captured in place by camera devices with various resolutions. Our goal is to find the most suitable deep-learning architecture for our task. Therefore, we consider three main families of detectors: Faster Region-based Convolutional Neural Network (Faster R-CNN), Region-based Fully Convolutional Network (R-FCN), and Single Shot Multibox Detector (SSD), which for the purpose of this work we call "deep learning meta-architectures". We combine each of these meta-architectures with "deep feature extractors" such as VGG net and Residual Network (ResNet). We demonstrate the performance of these meta-architectures and feature extractors, and additionally propose a method for local and global class annotation and data augmentation to increase accuracy and reduce the number of false positives during training. We train and test our systems end-to-end on our large Tomato Diseases and Pests Dataset, which contains challenging images of diseases and pests, including several inter- and extra-class variations, such as infection status and location on the plant. Experimental results show that our proposed system can effectively recognize nine different types of diseases and pests, with the ability to deal with complex scenarios from a plant's surrounding area.
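The abstract mentions data augmentation as one way to increase accuracy and reduce false positives during training. As a minimal sketch of the idea, the snippet below applies simple label-preserving flip transforms to an image represented as a nested list of pixel values; the paper's actual augmentation pipeline is richer, and the function names here are illustrative, not from the paper.

```python
def horizontal_flip(image):
    """Mirror each row of the image (left-right flip)."""
    return [row[::-1] for row in image]

def vertical_flip(image):
    """Reverse the row order of the image (top-bottom flip)."""
    return image[::-1]

def augment(image):
    """Yield the original image plus simple flip variants,
    multiplying the effective training-set size."""
    yield image
    yield horizontal_flip(image)
    yield vertical_flip(image)

# Toy 2x2 "image" to show the variants produced:
img = [[1, 2],
       [3, 4]]
variants = list(augment(img))
print(len(variants))  # 3
```

Because flips do not change which disease or pest is present, the class label of every generated variant stays the same, which is what makes this kind of augmentation safe for detection training.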
Highlights
Crops are affected by a wide variety of diseases and pests, especially in tropical, subtropical, and temperate regions of the world [1]
We focus principally on three recent architectures: Faster Region-based Convolutional Neural Network (Faster R-CNN) [30], Single Shot Multibox Detector (SSD) [31], and Region-based Fully Convolutional Networks (R-FCN) [32]
SSD with a Residual Network (ResNet)-50 feature extractor performs at 82.53%, and R-FCN with ResNet-50 achieves a mean AP of 85.98%, which is slightly better than Faster R-CNN overall and comparable in some classes
Summary
Crops are affected by a wide variety of diseases and pests, especially in tropical, subtropical, and temperate regions of the world [1]. Although the mean AP of the whole system exceeds 80% in the best cases, some diseases, such as leaf mold, gray mold, canker, and plague, show variable performance. Both Faster R-CNN and R-FCN use a Region Proposal Network (RPN) to generate candidate regions from the features of the last CNN layer; in our experiments, using different feature extractors, VGG-16 for Faster R-CNN and ResNet-50 for R-FCN, they show comparable results and strong performance on the more challenging classes. Early detection of diseases or pests could prevent losses across the whole crop; we consider leaf mold, gray mold, canker, and pest the most complex and most important classes due to their high intra-class variation (e.g., infection status, location of the infection on the plant, side of the leaf, type of fungus, etc.) and some inter-class similarities, especially in the last stage of infection when the plant is already dead