Abstract

Raw multi-source remote sensing data include multi-channel multispectral images and single-channel panchromatic images. Raw multispectral images are rich in spectral detail but low in spatial resolution, whereas raw panchromatic images have high spatial resolution but lack spectral information. Pansharpening fuses raw panchromatic and multispectral images to obtain remote sensing images that are accurate in both domains. Compared with traditional fusion methods, deep learning-based pansharpening methods perform better. This study therefore proposes an unsupervised fusion algorithm based on an improved generative adversarial network (GAN), named the multi-scale detail injection generative adversarial network (MSDI-GAN). The algorithm uses a network structure with one generator and two discriminators, preserving the spatial and spectral information of the original multi-source data separately. Each convolutional layer of the generator extracts multi-scale feature information from the original data and injects it into the corresponding layer of the spatial or spectral discriminator. This design deepens the exchange of gradient information between the generator and the discriminators, improves the network training process, and accelerates convergence. Extensive experiments are conducted to evaluate the performance of our algorithm and to compare it with other algorithms. Both a qualitative evaluation of visual quality and a comparison of quantitative metrics demonstrate that our method achieves superior results in the pansharpening of remote sensing images.
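The multi-scale injection idea described above can be sketched structurally: the generator produces feature maps at several resolutions, and each map is concatenated into the matching stage of a discriminator. The following is a minimal NumPy toy, not the paper's implementation; all function names, shapes, and the use of average pooling in place of learned convolutions are illustrative assumptions.

```python
import numpy as np

def avg_pool2(x):
    """Downsample an (H, W, C) array by 2x average pooling
    (a stand-in for a strided convolutional layer)."""
    h, w, c = x.shape
    return x[: h // 2 * 2, : w // 2 * 2].reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

def generator_features(pan, ms, n_scales=3):
    """Toy stand-in for the generator's per-layer feature maps:
    a multi-scale pyramid built from the concatenated PAN and MS inputs."""
    x = np.concatenate([pan, ms], axis=-1)  # fuse PAN and MS channels
    feats = [x]
    for _ in range(n_scales - 1):
        x = avg_pool2(x)
        feats.append(x)  # one feature map per scale
    return feats

def discriminator_with_injection(image, injected_feats):
    """Toy discriminator (spatial or spectral): at each stage, concatenate
    the generator feature map of matching resolution, then downsample.
    Returns a scalar 'realness' score."""
    x = image
    for f in injected_feats:
        x = np.concatenate([x, f], axis=-1)  # inject generator features
        x = avg_pool2(x)                     # move to the next scale
    return float(x.mean())

# Example: an 8x8 single-channel PAN image and a 4-band MS image.
pan = np.random.rand(8, 8, 1)
ms = np.random.rand(8, 8, 4)
feats = generator_features(pan, ms)          # scales: 8x8, 4x4, 2x2
spatial_score = discriminator_with_injection(pan, feats)
spectral_score = discriminator_with_injection(ms, feats)
```

In this sketch the same feature pyramid feeds both discriminators, so gradients from the spatial and spectral adversarial losses both flow back through every generator scale, which is the mechanism the abstract credits for faster convergence.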
