Abstract
Classical deep learning approaches are now commonly used for spatial and temporal classification, especially of Very High Resolution (VHR) images. Because of the limited availability and high cost of VHR remote sensing images, these models learn from existing low-resolution or undersized datasets. Their success has therefore been limited, since classifying high-dimensional multispectral time series data with few labeled samples is quite challenging, and simulating high-quality samples that share the features of real ones is equally difficult. The introduction of Generative Adversarial Networks (GANs) as an unsupervised learning method has enabled the extraction of accurate representations of the data via latent codes and backpropagation techniques. However, GANs still struggle to produce high-quality samples, suffering from unwanted noise and uncontrolled divergence during training. To generate high-quality multispectral time series samples, a Self-Attention Generative Adversarial Network (SAGAN) is proposed in this work. SAGAN enables attention-driven, long-range dependency modeling for VHR multispectral time series image generation tasks. Traditional convolutional GANs generate high-resolution details as a function of only spatially local points in lower-resolution feature maps; in SAGAN, details can be generated using cues from all feature locations, which also improves training dynamics. The proposed SAGAN performs better than traditional GANs, boosting the best Inception Score. The main contribution of this work is the application of SAGAN, one of the new generation of learning techniques, to time series VHR multispectral image generation; until now, SAGAN has been used only for single-image generation.
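The contrast the abstract draws between convolutional GANs (details computed from local points) and SAGAN (details computed from all feature locations) can be illustrated with a minimal NumPy sketch of a SAGAN-style self-attention layer. This is an illustrative simplification, not the paper's implementation: the weight matrices `wf`, `wg`, `wh` stand in for learned 1x1 convolutions, and `gamma` is the learnable scale that SAGAN initializes to zero so the network starts out purely convolutional.

```python
import numpy as np

def softmax(z, axis=0):
    """Numerically stable softmax along the given axis."""
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, wf, wg, wh, gamma=0.0):
    """SAGAN-style self-attention over a single feature map (sketch).

    x           : (C, H, W) feature map.
    wf, wg      : (C_attn, C) query/key projections (stand-ins for 1x1 convs).
    wh          : (C, C) value projection.
    gamma       : learnable scale; at gamma=0 the layer is an identity,
                  which is how SAGAN initializes it.

    Every output location j is a gamma-weighted mixture of the values at
    ALL locations i, so cues are drawn from the whole feature map rather
    than a local convolutional neighborhood.
    """
    C, H, W = x.shape
    n = H * W
    flat = x.reshape(C, n)            # flatten spatial dimensions
    f = wf @ flat                     # queries, shape (C_attn, N)
    g = wg @ flat                     # keys,    shape (C_attn, N)
    h = wh @ flat                     # values,  shape (C, N)
    # attn[i, j] = how much location j attends to location i
    attn = softmax(f.T @ g, axis=0)   # (N, N), columns sum to 1
    o = h @ attn                      # long-range mixture of all locations
    return (gamma * o + flat).reshape(C, H, W)
```

With `gamma = 0` the residual connection returns the input unchanged, which lets training begin with the ordinary convolutional pathway and gradually weight in the long-range attention term.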