Abstract
Change detection methods using hyperspectral remote sensing can precisely identify differences in the same area observed at different times. However, because of the massive number of spectral bands, current change detection methods that compute attention maps stagewise are vulnerable to irrelevant spectral and spatial information in hyperspectral images. In addition, current change detection methods arrange hidden change features in a random distribution, which cannot express class-oriented discrimination in advance. Moreover, existing deep change detection methods have not fully considered the reuse of hierarchical features or their fusion within the encoder–decoder framework. To address these problems, a parallel spectral–spatial attention network with a feature redistribution loss (TFR-PS2ANet) is proposed. The contributions of this article are summarized as follows: (1) a parallel spectral–spatial attention module (PS2A) is introduced to enhance relevant information and suppress irrelevant information in parallel, using spectral and spatial attention maps extracted from the original hyperspectral image patches; (2) a feature redistribution loss function (FRL) is introduced to construct a class-oriented feature distribution, which organizes the change features in advance and improves discriminative ability; (3) a two-branch encoder–decoder framework is developed to optimize hierarchical feature transfer and change-feature fusion. Extensive experiments were carried out on several real datasets. The results show that the proposed PS2A effectively enhances significant information, the FRL optimizes the class-oriented feature distribution, and the proposed method outperforms most existing change detection methods.
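To make the two main ideas concrete, the sketch below illustrates one plausible reading of the abstract in PyTorch: a parallel spectral–spatial attention block that derives both attention maps from the same input patch and applies them jointly (rather than stagewise), and a center-loss-style stand-in for the feature redistribution loss. The module and function names, tensor shapes, and all hyperparameters here are illustrative assumptions; the paper's actual PS2A architecture and FRL formulation may differ.

```python
# Minimal sketch, assuming a squeeze-and-excitation-style spectral branch and a
# CBAM-style spatial branch applied in parallel. All names and shapes are
# hypothetical illustrations of the abstract, not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PS2A(nn.Module):
    """Parallel spectral-spatial attention: both attention maps are computed
    from the original input and applied together, avoiding the stagewise
    (sequential) calculation the abstract criticizes."""
    def __init__(self, bands: int, reduction: int = 8):
        super().__init__()
        # Spectral branch: squeeze spatial dims, re-weight the spectral bands.
        self.spectral = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(bands, bands // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(bands // reduction, bands, kernel_size=1),
            nn.Sigmoid(),
        )
        # Spatial branch: squeeze bands (mean + max), re-weight pixel locations.
        self.spatial = nn.Sequential(
            nn.Conv2d(2, 1, kernel_size=7, padding=3),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        spec_att = self.spectral(x)                        # (B, C, 1, 1)
        pooled = torch.cat([x.mean(dim=1, keepdim=True),
                            x.max(dim=1, keepdim=True).values], dim=1)
        spat_att = self.spatial(pooled)                    # (B, 1, H, W)
        # Parallel application: both maps weight the original features,
        # so neither attention stage conditions on the other's output.
        return x * spec_att * spat_att

def feature_redistribution_loss(features: torch.Tensor,
                                labels: torch.Tensor,
                                centers: torch.Tensor) -> torch.Tensor:
    """Stand-in for the FRL: pull hidden change features toward learnable
    per-class centers so the feature distribution is class-oriented in
    advance. The true FRL is not specified in the abstract."""
    return F.mse_loss(features, centers[labels])

# Usage sketch: 16x16 patches with 128 spectral bands (assumed sizes).
att = PS2A(bands=128)
x = torch.randn(4, 128, 16, 16)
y = att(x)   # same shape as x, attention-weighted
```

Under this reading, the design choice is that the spectral and spatial maps are independent functions of the input, so irrelevant information suppressed by one branch cannot corrupt the statistics the other branch attends over, which is the failure mode the abstract attributes to stagewise attention.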