Abstract
Research on deep-learning-based recognition and segmentation of plant diseases in simple environments has achieved relative success. However, under complex environmental conditions and with a lack of samples, models have difficulty recognizing disease spots, or their recognition accuracy is low. This paper investigates how to improve model recognition accuracy when the dataset comes from a complex environment and lacks samples. First, to handle the complex environment, DeepLabV3+ is used to segment sugarcane leaves from complex backgrounds; second, to address the shortage of training images of sugarcane leaves, two data augmentation methods are applied: supervised data augmentation and deep convolutional generative adversarial networks (DCGANs). MobileNetV3-large, AlexNet, ResNet, and DenseNet are trained on six dataset variants: the original dataset, the original dataset with supervised augmentation, the original dataset with DCGAN augmentation, the background-removed dataset, the background-removed dataset with supervised augmentation, and the background-removed dataset with DCGAN augmentation. The recognition abilities of the trained models are then compared on the same test set. Based on accuracy and training time, the optimal network is MobileNetV3-large. Classification using MobileNetV3-large trained on the original dataset yielded 53.5% accuracy; by removing the background and adding synthetic images produced by the DCGAN, the accuracy increased to 99%.
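The supervised data augmentation mentioned above can be sketched as follows. This is a minimal illustration only: the abstract does not specify which transforms the paper uses, so the flip/rotation/brightness set and the `augment` function name here are assumptions, shown with NumPy arrays standing in for leaf images.

```python
import numpy as np

def augment(image, seed=0):
    """Produce simple supervised augmentations of an H x W x C image array.

    Hypothetical sketch: flips, 90-degree rotations, and brightness jitter
    are common label-preserving transforms; the paper's exact set may differ.
    """
    rng = np.random.default_rng(seed)
    variants = [np.fliplr(image), np.flipud(image)]           # mirror flips
    variants += [np.rot90(image, k) for k in (1, 2, 3)]       # rotations
    factor = rng.uniform(0.8, 1.2)                            # brightness jitter
    jittered = np.clip(image.astype(np.float32) * factor, 0, 255)
    variants.append(jittered.astype(image.dtype))
    return variants

# Toy 4x4 RGB "image" standing in for a sugarcane leaf photo.
img = np.arange(48, dtype=np.uint8).reshape(4, 4, 3)
augmented = augment(img)
print(len(augmented))  # 6 augmented variants per input image
```

Each input image thus yields several label-preserving variants, which is how supervised augmentation enlarges a small training set without new field photography.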