Abstract
To achieve good remote sensing image scene classification, deep learning models usually require a large number of training samples. Unfortunately, collecting a large number of training scene images usually incurs high acquisition and processing costs. In contrast, once a generative adversarial network (GAN) has been trained, its generator can produce scene samples automatically at low cost, and the generated images can be added to the training set. A model with better classification ability is obtained when these samples contain more diverse scene structures and essential features than the original real images. In this letter, we propose the scene image diversity improvement GAN (diversity-GAN). Diversity-GAN has two important advantages. 1) The training process is designed in a progressive manner: the GAN's generator and discriminator advance from coarse- to fine-resolution scene images. This characteristic guarantees the diversity of the generated samples and, in particular, the diversity of the structure of the generated scene images. 2) The training progress is controllable: by introducing control parameters, diversity-GAN can directly determine the scene image resolution on which the training process should focus. This characteristic allows diversity-GAN to achieve scene image structure diversity at the coarse-resolution training stage with only a few iterations. In the experiments, the UC-Merced and AID data sets are used. The results show that the samples generated by diversity-GAN effectively improve the diversity of the sample set, and that these generated samples grant convolutional neural networks (CNNs) better classification ability in the training stage.
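The controllable coarse-to-fine training described above can be sketched as a resolution schedule. This is a minimal illustration, not the authors' implementation: the function name `progressive_schedule` and the parameter `stage_iters` (playing the role of the control parameters that set how long training dwells at each resolution) are hypothetical.

```python
def progressive_schedule(resolutions, stage_iters):
    """Yield (iteration, resolution) pairs for a progressive GAN
    training loop: training starts at the coarsest resolution and
    moves to finer ones, spending stage_iters[i] iterations at
    resolutions[i].  `stage_iters` stands in for the control
    parameters mentioned in the abstract (hypothetical name)."""
    iteration = 0
    for resolution, num_iters in zip(resolutions, stage_iters):
        for _ in range(num_iters):
            yield iteration, resolution
            iteration += 1

# Example: only a few iterations at the coarse resolutions (enough,
# per the abstract, to capture diverse scene structure), with more
# iterations reserved for the finest resolution.
schedule = list(progressive_schedule([16, 32, 64, 128], [2, 2, 3, 5]))
```

Under such a schedule, each generator/discriminator update at iteration `i` would operate on scene images at the resolution paired with `i`.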