Abstract

In practice, acquiring and annotating data in specialized domains can be costly, which constrains the performance and applicability of deep learning. Synthesizing data with generative models is an effective augmentation technique. This research therefore proposes a diseased-leaf generation pipeline to diversify maize disease datasets. We introduce the Dual-Perception Cycle-Consistent Generative Adversarial Network (DP-CycleGAN), which incorporates our proposed Structure Perception (SP) and Texture Perception (TP) loss functions during training. These losses guide the model's attention through activation reconstruction and mask mechanisms, thereby improving the overall perceptual quality of the generated images and the realism of the disease lesions. We constructed a maize leaf mixed-disease dataset to simulate the complex conditions under which diseases occur in the field. Experimental results show that DP-CycleGAN generates higher-quality and more realistic diseased-leaf images. Compared with CycleGAN and a state-of-the-art method, DP-CycleGAN reduces the Fréchet Inception Distance (FID) by 29.6% and 15.7% and increases the Structural Similarity (SSIM) by 125.3% and 61.5%, respectively. Moreover, incorporating the synthetic data during training significantly enhances the recognition model's performance under both data abundance and data scarcity, with improvement rates more than twice those of existing state-of-the-art methods. This contributes to the application of Artificial Intelligence (AI) in agricultural production practices.
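
The abstract does not define the SP and TP losses, so the sketch below is only one plausible way to pair a structure term (activation reconstruction against the cycle-reconstructed image) and a masked texture term with CycleGAN's adversarial and cycle-consistency objectives. The VGG-16 backbone, Gram-matrix texture statistic, lesion mask, and loss weights are all illustrative assumptions, not the paper's formulation.

```python
# Hedged sketch: combining structure- and texture-perception terms with
# CycleGAN-style generator losses. Backbone, mask, and weights are assumptions.
import torch
import torch.nn.functional as F
from torchvision.models import vgg16, VGG16_Weights

# Frozen VGG-16 features as a stand-in perceptual backbone (assumption).
# Inputs are assumed to be ImageNet-normalized; normalization is omitted here.
vgg = vgg16(weights=VGG16_Weights.DEFAULT).features[:16].eval()
for p in vgg.parameters():
    p.requires_grad_(False)

def gram(feat):
    """Gram matrix of a feature map, used as a simple texture statistic."""
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def structure_perception_loss(reconstructed, original):
    """Structure term: match activations of the cycle-reconstructed image
    to those of the original input (assumed form of 'activation reconstruction')."""
    return F.l1_loss(vgg(reconstructed), vgg(original))

def texture_perception_loss(reconstructed, original, lesion_mask):
    """Texture term: match Gram statistics inside a lesion mask (assumed form).
    lesion_mask is a (B, 1, H, W) binary map highlighting diseased regions."""
    return F.l1_loss(gram(vgg(reconstructed * lesion_mask)),
                     gram(vgg(original * lesion_mask)))

def generator_loss(adv_loss, cycle_loss, reconstructed, original, lesion_mask,
                   lambda_cyc=10.0, lambda_sp=1.0, lambda_tp=1.0):
    """Illustrative weighting of adversarial, cycle, SP, and TP terms."""
    return (adv_loss
            + lambda_cyc * cycle_loss
            + lambda_sp * structure_perception_loss(reconstructed, original)
            + lambda_tp * texture_perception_loss(reconstructed, original, lesion_mask))
```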
