Abstract

Deep learning has shown strong performance in remote sensing target detection, scene classification, and change detection. However, training requires a large number of samples, and producing high-quality training samples demands considerable time and human effort. As the resolution of remote sensing images improves and scene information becomes increasingly complex, existing sample augmentation methods exhibit obvious defects, such as the loss of key sample information and a lack of diversity in the augmented datasets. To address these problems, this article proposes a new augmentation method for remote sensing scene data, named Rel-SAGAN, based on generative adversarial networks (GANs). First, a self-attention mechanism is incorporated to improve the GAN's ability to learn large-scale features and to reduce its number of parameters. Second, a relativistic adversarial loss function is used to improve the training stability of the GAN. Furthermore, larger convolution kernels and a deeper network structure improve global feature extraction and shorten training time. Finally, taking the NWPU remote sensing image dataset as experimental data, the effectiveness of the proposed method is verified with ResNet-18. The experimental results show that the method generates more diverse high-resolution remote sensing natural scene images; when the training dataset is augmented with high-quality generated images selected by inception score (IS) and Fréchet inception distance (FID) evaluations, the overall classification accuracy improves by more than 3% and is generally higher than that of traditional data augmentation methods.
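
As a concrete illustration of the self-attention component mentioned above, the following is a minimal PyTorch sketch of a SAGAN-style self-attention block. It is not the paper's implementation; the class and parameter names (`SelfAttention`, `channels`, the `channels // 8` query/key reduction) are illustrative assumptions based on the standard SAGAN formulation.

```python
import torch
import torch.nn as nn


class SelfAttention(nn.Module):
    """SAGAN-style self-attention block (sketch, not the paper's code).

    1x1 convolutions project the feature map to query/key/value maps;
    reducing query/key channels to channels // 8 keeps the N x N
    attention map cheap to compute (assumes channels >= 8).
    """

    def __init__(self, channels: int):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, kernel_size=1)
        self.key = nn.Conv2d(channels, channels // 8, kernel_size=1)
        self.value = nn.Conv2d(channels, channels, kernel_size=1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learned residual weight

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        q = self.query(x).view(b, -1, h * w).permute(0, 2, 1)  # B x N x C'
        k = self.key(x).view(b, -1, h * w)                     # B x C' x N
        attn = torch.softmax(torch.bmm(q, k), dim=-1)          # B x N x N
        v = self.value(x).view(b, -1, h * w)                   # B x C x N
        out = torch.bmm(v, attn.permute(0, 2, 1)).view(b, c, h, w)
        # gamma starts at zero, so the block begins as an identity mapping
        # and the network gradually learns how much attention to apply.
        return self.gamma * out + x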
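The relativistic adversarial loss can likewise be sketched, here assuming the relativistic average GAN (RaGAN) form, in which the discriminator estimates how much more realistic real images are than the average generated image rather than scoring each image in isolation. This is a minimal sketch under that assumption, not necessarily the paper's exact objective; the function names are hypothetical.

```python
import torch
import torch.nn.functional as F


def relativistic_d_loss(real_logits: torch.Tensor,
                        fake_logits: torch.Tensor) -> torch.Tensor:
    """Relativistic average discriminator loss (RaGAN sketch):
    real samples should score above the mean fake score, and fake
    samples below the mean real score."""
    ones = torch.ones_like(real_logits)
    zeros = torch.zeros_like(fake_logits)
    loss_real = F.binary_cross_entropy_with_logits(
        real_logits - fake_logits.mean(), ones)
    loss_fake = F.binary_cross_entropy_with_logits(
        fake_logits - real_logits.mean(), zeros)
    return loss_real + loss_fake


def relativistic_g_loss(real_logits: torch.Tensor,
                        fake_logits: torch.Tensor) -> torch.Tensor:
    """Generator loss mirrors the discriminator loss with the targets
    swapped, so the generator also receives gradients from real samples,
    which is one reason relativistic losses tend to train more stably."""
    ones = torch.ones_like(fake_logits)
    zeros = torch.zeros_like(real_logits)
    loss_fake = F.binary_cross_entropy_with_logits(
        fake_logits - real_logits.mean(), ones)
    loss_real = F.binary_cross_entropy_with_logits(
        real_logits - fake_logits.mean(), zeros)
    return loss_fake + loss_real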
