Abstract

Preserving spectral and spatial information is one of the essential challenges in satellite image fusion. This paper presents a GAN-based method for fusing panchromatic and multispectral satellite images. The proposed method adopts the Cycle-GAN idea with two generators, one for spectral and one for spatial information preservation, built on the residual-in-residual dense block super-resolution architecture. Generator-1 first translates the panchromatic and multispectral images into a high-resolution fused image, and generator-2 then preserves detail by translating the fused image back, so that the result retains the spatial detail of the panchromatic image and the spectral detail of the multispectral image. Two discriminators are employed, one for the spectral and one for the spatial transformation, and a weighted L1 loss serves as the cycle loss for both. By leveraging the complementary roles of the two generators, the proposed method achieves high-quality fusion results with improved spectral and spatial resolution, evaluated on two different datasets. The experimental results demonstrate the effectiveness of the proposed method, with enhancements of approximately 2% to 30% on WorldView2 images and 0.5% to 35% on GeoEye-1 images across five metrics including PSNR, and further enhancements of approximately 7% to 50% and 22% to 29% on the same datasets in metrics such as SAM and ERGAS.
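To make the two-generator cycle concrete, the following is a minimal PyTorch-style sketch of the forward translation (PAN + MS to fused) followed by the backward translation and a weighted L1 cycle loss. The generator class, channel counts, and the weights lambda_spat and lambda_spec are illustrative assumptions, not the authors' implementation; the real generators use residual-in-residual dense blocks.

```python
# Minimal sketch of the two-generator cycle with a weighted L1 cycle loss.
# G1, G2, TinyGenerator, lambda_spat, and lambda_spec are hypothetical placeholders.
import torch
import torch.nn as nn

class TinyGenerator(nn.Module):
    """Stand-in generator; the paper's generators use RRDB super-resolution blocks."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, out_ch, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

# Assumed shapes: PAN is 1-band, MS is 4-band upsampled to the PAN grid, fused is 4-band.
G1 = TinyGenerator(in_ch=1 + 4, out_ch=4)   # generator-1: PAN + MS -> fused image
G2 = TinyGenerator(in_ch=4, out_ch=1 + 4)   # generator-2: fused -> reconstructed PAN + MS

l1 = nn.L1Loss()
lambda_spat, lambda_spec = 1.0, 1.0         # assumed cycle-loss weights

pan = torch.rand(2, 1, 64, 64)              # toy batch for illustration
ms = torch.rand(2, 4, 64, 64)

fused = G1(torch.cat([pan, ms], dim=1))     # forward translation to the fused image
recon = G2(fused)                           # backward translation
recon_pan, recon_ms = recon[:, :1], recon[:, 1:]

# Weighted L1 cycle loss: spatial term against PAN, spectral term against MS.
cycle_loss = lambda_spat * l1(recon_pan, pan) + lambda_spec * l1(recon_ms, ms)
cycle_loss.backward()
```

In a full training loop this cycle loss would be combined with the adversarial losses from the spectral and spatial discriminators; the sketch omits those terms.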
