Abstract
Most convolutional neural network (CNN) models have difficulty identifying crop diseases owing to morphological and physiological changes in crop tissues and cells. Furthermore, a single crop disease can present different symptoms: early and late stages of the same disease typically differ in the affected leaf area and in lesion color, which poses additional difficulties for CNN models. Here, we propose a lightweight CNN model called GrapeNet for identifying different symptom stages of specific grape diseases. The main components of GrapeNet are residual blocks, residual feature fusion blocks (RFFBs), and convolutional block attention modules (CBAMs). The residual blocks deepen the network and extract rich features. To alleviate the performance degradation associated with a large number of hidden layers, we designed the RFFB module based on the residual block: it fuses the average-pooled feature map taken before the residual block input with the high-dimensional feature maps produced after the residual block output by a concatenation operation, thereby achieving feature fusion at different depths. In addition, a CBAM is introduced after each RFFB module to extract valid disease information. The identification accuracy was 82.99%, 84.01%, 82.74%, 84.77%, 80.96%, 82.74%, 80.96%, 83.76%, and 86.29% for GoogLeNet, Vgg16, ResNet34, DenseNet121, MobileNetV2, MobileNetV3_large, ShuffleNetV2_×1.0, EfficientNetV2_s, and GrapeNet, respectively, so GrapeNet achieved the best classification performance among the compared models. GrapeNet has only 2.15 million parameters. Compared with DenseNet121, which has the highest accuracy among the classical network models, GrapeNet uses 4.81 million fewer parameters and requires about half the training time. Moreover, Grad-CAM visualizations indicate that introducing CBAM emphasizes disease information and suppresses irrelevant information. The overall results suggest that the GrapeNet model is useful for the automatic identification of grape leaf diseases.
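To make the feature-fusion scheme concrete, the following PyTorch sketch illustrates one plausible reading of an RFFB followed by a CBAM, as described above. The layer widths, strides, kernel sizes, and the internal layouts of the residual block and the CBAM are illustrative assumptions; the abstract only specifies that the average-pooled block input is concatenated with the residual block output and that a CBAM follows each RFFB.

```python
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    """Basic residual block (assumed layout): two 3x3 convs with a projection shortcut."""
    def __init__(self, in_ch, out_ch, stride=2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1, bias=False),
            nn.BatchNorm2d(out_ch),
        )
        self.shortcut = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 1, stride=stride, bias=False),
            nn.BatchNorm2d(out_ch),
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.relu(self.body(x) + self.shortcut(x))


class CBAM(nn.Module):
    """Convolutional block attention module: channel attention followed by spatial attention."""
    def __init__(self, channels, reduction=16, spatial_kernel=7):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1, bias=False),
        )
        self.spatial = nn.Conv2d(2, 1, spatial_kernel, padding=spatial_kernel // 2, bias=False)

    def forward(self, x):
        # Channel attention: shared MLP over global average- and max-pooled descriptors.
        avg = self.mlp(torch.mean(x, dim=(2, 3), keepdim=True))
        mx = self.mlp(torch.amax(x, dim=(2, 3), keepdim=True))
        x = x * torch.sigmoid(avg + mx)
        # Spatial attention: 7x7 conv over channel-wise average and max maps.
        s = torch.cat([x.mean(dim=1, keepdim=True), x.amax(dim=1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.spatial(s))


class RFFB(nn.Module):
    """Residual feature fusion block (sketch): concatenate the average-pooled block input
    (shallow features) with the residual block output (deep features) along the channel axis."""
    def __init__(self, in_ch, out_ch, stride=2):
        super().__init__()
        self.res_block = ResidualBlock(in_ch, out_ch, stride=stride)
        self.pool = nn.AvgPool2d(kernel_size=stride, stride=stride)  # match the deep path's spatial size

    def forward(self, x):
        shallow = self.pool(x)     # average-pooled feature map before the residual block
        deep = self.res_block(x)   # high-dimensional feature map after the residual block
        return torch.cat([shallow, deep], dim=1)  # feature fusion at different depths


if __name__ == "__main__":
    # CBAM is placed after each RFFB, as described in the abstract; channel counts are illustrative.
    stage = nn.Sequential(RFFB(in_ch=64, out_ch=128), CBAM(channels=64 + 128))
    print(stage(torch.randn(1, 64, 56, 56)).shape)  # torch.Size([1, 192, 28, 28])
```

Note that under this reading the concatenation grows the channel count by the block's input width (64 + 128 = 192 above), so each subsequent stage, and the CBAM after it, must be sized to the fused feature map.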