Abstract

Conventional deep-learning-based pan-sharpening approaches struggle to harmonize the input panchromatic (PAN) and multi-spectral (MS) images across their differing channels, and existing methods often suffer from spectral distortion and inadequate texture representation. To address these limitations, we present a constraint-based image generation strategy tailored to the pan-sharpening task. Our method employs a multi-scale conditional invertible neural network, named PSCINN, which converts the ground truth MS image into a downscaled MS image and a latent variable under the guidance of the PAN image. Subsequently, the latent variable, resampled from a prior distribution, and the low-resolution MS image are used to predict the pan-sharpened image in an information-preserving manner, with the PAN image again providing guidance during the reverse process. Furthermore, we carefully design a conditional invertible block with a tractable Jacobian determinant for spectral information recovery. This structure pre-processes the conditioning PAN image into usable texture information, preventing the spectral information in the pan-sharpened result from being contaminated. The proposed PSCINN outperforms existing state-of-the-art pan-sharpening methods in both objective and subjective evaluations, and further experiments confirm a substantial improvement in perceptual quality. The source code for PSCINN will be available at https://github.com/jiaming-wang/PSCINN.
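To make the core mechanism concrete, below is a minimal sketch of a conditional affine coupling block of the kind conditional invertible networks are typically built from. The channel split, convolution widths, and the way the PAN-derived condition enters the coupling are illustrative assumptions, not the exact PSCINN block from the paper; the sketch only demonstrates the general principle that the forward transform is exactly invertible and its Jacobian log-determinant is cheap to compute.

```python
import torch
import torch.nn as nn

class ConditionalAffineCoupling(nn.Module):
    """Illustrative conditional affine coupling block (not the exact
    PSCINN block; layer sizes and conditioning scheme are assumptions)."""

    def __init__(self, channels: int, cond_channels: int, hidden: int = 64):
        super().__init__()
        self.c1 = channels // 2            # first half of the channel split
        self.c2 = channels - self.c1       # second half of the channel split
        # A small conv net predicts a per-pixel log-scale and shift for x2,
        # conditioned on x1 and on features extracted from the PAN image.
        self.net = nn.Sequential(
            nn.Conv2d(self.c1 + cond_channels, hidden, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, 2 * self.c2, 3, padding=1),
        )

    def forward(self, x, cond):
        """Forward transform; returns the output and log|det J|."""
        x1, x2 = x.split([self.c1, self.c2], dim=1)
        log_s, t = self.net(torch.cat([x1, cond], dim=1)).chunk(2, dim=1)
        log_s = torch.tanh(log_s)          # bound the scale for stability
        y2 = x2 * torch.exp(log_s) + t
        # The Jacobian of an affine coupling is triangular, so its
        # log-determinant is simply the sum of the log-scales.
        logdet = log_s.flatten(1).sum(dim=1)
        return torch.cat([x1, y2], dim=1), logdet

    def inverse(self, y, cond):
        """Exact inverse: recovers x from y using the same condition."""
        y1, y2 = y.split([self.c1, self.c2], dim=1)
        log_s, t = self.net(torch.cat([y1, cond], dim=1)).chunk(2, dim=1)
        log_s = torch.tanh(log_s)
        x2 = (y2 - t) * torch.exp(-log_s)
        return torch.cat([y1, x2], dim=1)
```

A short usage check under the same assumptions (a 4-band MS patch and 8 hypothetical PAN feature channels) verifies that the inverse pass reconstructs the input exactly, which is the information-preserving property the abstract relies on:

```python
coupling = ConditionalAffineCoupling(channels=4, cond_channels=8)
ms = torch.randn(1, 4, 64, 64)        # 4-band MS patch
pan_feat = torch.randn(1, 8, 64, 64)  # features derived from the PAN image
z, logdet = coupling(ms, pan_feat)
ms_rec = coupling.inverse(z, pan_feat)
assert torch.allclose(ms, ms_rec, atol=1e-5)
```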
