Abstract
Pan-sharpening aims to obtain a high-resolution multi-spectral image from a high-spatial-resolution panchromatic image and a low-spatial-resolution multi-spectral image. In recent years, pan-sharpening methods based on supervised learning have achieved superior performance over traditional methods. However, all of these supervised methods rest upon the assumption that a model trained at a coarse scale generalizes well to a finer one, which is not always the case. To address this problem, we propose DOCSNet, a novel dual-output and cross-scale learning strategy for pan-sharpening. DOCSNet consists of two sub-networks, ReducedNet1 and FullNet2, each adapted from a simple three-convolutional-layer structure and progressively cascaded. ReducedNet1 is first trained on the reduced-scale training set and its parameters are then frozen; the whole network (the fixed ReducedNet1 cascaded with FullNet2) is subsequently trained with a cross-scale strategy that uses reduced- and full-resolution training samples simultaneously. Each sub-network has its own output terminal, producing reduced-scale and target-scale results, respectively. To the best of our knowledge, this is the first attempt to introduce a dual-output architecture into a pan-sharpening framework. Extensive experiments on GaoFen-2 and WorldView-3 satellite images demonstrate that DOCSNet outperforms other state-of-the-art pan-sharpening methods in terms of both qualitative visual effects and quantitative metrics.
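The abstract describes a cascade of two three-layer sub-networks with one output terminal each, plus a two-stage schedule in which ReducedNet1 is trained first and then frozen. The following is a minimal PyTorch-style sketch of that structure only, not the authors' released code: the sub-network names follow the abstract, while the layer widths, the bicubic upsampling of the multi-spectral input, and the concatenation interface between the two sub-networks are illustrative assumptions.

```python
# Hedged sketch of the dual-output, cascaded architecture described in the abstract.
# Layer widths, upsampling mode, and the cascading interface are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ThreeLayerCNN(nn.Module):
    """A simple three-convolutional-layer sub-network (hidden width assumed)."""
    def __init__(self, in_ch, out_ch, hidden=64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, hidden, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(hidden, hidden, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(hidden, out_ch, 3, padding=1),
        )

    def forward(self, x):
        return self.body(x)


class DOCSNetSketch(nn.Module):
    """ReducedNet1 cascaded with FullNet2; each sub-network has its own output."""
    def __init__(self, ms_bands=4):
        super().__init__()
        # Both sub-networks take a concatenated (multi-spectral, PAN) stack.
        self.reduced_net1 = ThreeLayerCNN(ms_bands + 1, ms_bands)
        self.full_net2 = ThreeLayerCNN(ms_bands + 1, ms_bands)

    def forward(self, ms_lr, pan):
        # Upsample the low-resolution MS image to the PAN grid (bicubic assumed).
        ms_up = F.interpolate(ms_lr, size=pan.shape[-2:], mode="bicubic",
                              align_corners=False)
        out_reduced = self.reduced_net1(torch.cat([ms_up, pan], dim=1))
        out_full = self.full_net2(torch.cat([out_reduced, pan], dim=1))
        # Dual outputs: reduced-scale result and target-scale result.
        return out_reduced, out_full


def freeze(module):
    """Freeze a trained sub-network before the cross-scale training stage."""
    for p in module.parameters():
        p.requires_grad = False
```

Under this reading, stage one trains `reduced_net1` alone on reduced-scale pairs, then `freeze(model.reduced_net1)` is called and only `full_net2` is optimized during the cross-scale stage that mixes reduced- and full-resolution samples; the specific losses used at each scale are not stated in the abstract and are therefore omitted here.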