Abstract

Deep learning (DL) methods have achieved impressive performance for pansharpening in recent years. However, because of poor generalization, most DL methods achieve unsatisfactory performance on data acquired by sensors not considered during the training phase, and degraded performance on samples at full resolution. To address this issue, we propose a data augmentation framework for pansharpening neural networks. Specifically, we first introduce a random spatial degradation based on anisotropic Gaussian-shaped modulation transfer functions (MTFs) to improve generalization across different spatial models and sensors. Then, considering that different sensors have different ground sampling distances (GSDs), we randomly rescale the GSD of the training samples to improve generalization with respect to spatial resolution. Thanks to these modules, generalization to test data from different sensors and to samples at full resolution can easily be achieved. Experimental results demonstrate the effectiveness of the proposed approach, with better performance when the training data are decoupled from the testing data, and comparable performance when training and testing are coupled (i.e., data acquired by the same sensor are used in both phases). Moreover, the proposed approach improves the full-resolution performance of pansharpening neural networks. It has been integrated into existing pansharpening neural networks, showing satisfactory performance for widely used sensors, including GaoFen-1, QuickBird, WorldView-2, WorldView-3, IKONOS, Spot-7, GeoEye, and PHR1A.
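The two augmentation modules described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names (`anisotropic_mtf_kernel`, `augment_sample`), the Nyquist-gain range, the rescaling range, and the kernel size are all assumptions chosen for the example. The sigma formula follows the standard Gaussian MTF model, where the filter's frequency response equals the Nyquist gain g at the low-resolution Nyquist frequency 0.5/ratio.

```python
import numpy as np
from scipy.ndimage import convolve, zoom

def anisotropic_mtf_kernel(sigma_x, sigma_y, theta, size=15):
    """Rotated anisotropic Gaussian kernel approximating a sensor MTF."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    # Rotate the coordinate grid by theta so the two sigmas act
    # along arbitrary principal axes (the anisotropic part).
    xr = xx * np.cos(theta) + yy * np.sin(theta)
    yr = -xx * np.sin(theta) + yy * np.cos(theta)
    kern = np.exp(-0.5 * ((xr / sigma_x) ** 2 + (yr / sigma_y) ** 2))
    return kern / kern.sum()  # normalize to preserve mean radiance

def augment_sample(ms, ratio=4, gain_range=(0.2, 0.4),
                   scale_range=(0.75, 1.25), seed=None):
    """Degrade a (bands, H, W) image with a random anisotropic Gaussian
    MTF blur, then rescale it to simulate a different GSD.
    `gain_range` and `scale_range` are illustrative values, not the
    ranges used in the paper."""
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0.0, np.pi)  # random MTF orientation
    blurred = np.empty(ms.shape, dtype=float)
    for b, band in enumerate(ms):
        # Sample a random Nyquist gain per principal axis and band.
        gx, gy = rng.uniform(*gain_range, size=2)
        # Gaussian whose frequency response equals g at the
        # low-resolution Nyquist frequency 0.5/ratio:
        #   exp(-pi^2 sigma^2 / (2 ratio^2)) = g
        #   =>  sigma = ratio * sqrt(-2 ln g) / pi
        sx = ratio * np.sqrt(-2.0 * np.log(gx)) / np.pi
        sy = ratio * np.sqrt(-2.0 * np.log(gy)) / np.pi
        blurred[b] = convolve(band.astype(float),
                              anisotropic_mtf_kernel(sx, sy, theta),
                              mode="reflect")
    # Random GSD rescaling: resample the spatial axes by a random factor.
    s = rng.uniform(*scale_range)
    return zoom(blurred, (1, s, s), order=3)
```

In a training pipeline, `augment_sample` would be applied on the fly to each multispectral patch (and, consistently, to its panchromatic counterpart) so that every epoch exposes the network to a different simulated sensor MTF and spatial resolution.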
