Abstract

Change detection is a critical task in remote sensing for monitoring the state of the Earth's surface. Recently, the field has been dominated by deep-learning-based methods. Many models capture the temporal-spatial correlation in bitemporal images through non-local interaction between bitemporal features and achieve impressive performance. However, in complex scenes containing multiple change types or weakly discriminable objects, they struggle to fuse information discriminatively because the semantic discrimination of the bitemporal representations is weak. To address this problem, a difference-guided aggregation network (DGANet) is proposed, built on two key modules: a difference-guided aggregation module (DGAM) and a weighted metric module (WMM). In DGAM, the bitemporal features are aggregated under the guidance of their differences, which focuses the fusion on change relevance rather than semantic distinction, so the fused features are both change-relevant and discriminative. WMM computes an adaptive distance between the bitemporal features through dynamic feature attention across different dimensions, which helps suppress pseudo-changes. In addition, a change magnitude contrastive loss (CMCL) is introduced to exploit the dependency of bitemporal pixels across different bitemporal images, further enhancing the representation quality of the model; it is also extended in this work. The effectiveness of the three improvements is demonstrated by extensive ablation studies, and results on three widely used datasets show that our method achieves satisfactory performance.
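The core idea of difference-guided aggregation can be illustrated with a minimal sketch: a gate derived from the absolute difference of the bitemporal features modulates their fusion, so change-relevant positions are emphasized. This is an assumption-based toy version for intuition only; the function name, the sigmoid gate, and the averaging fusion are illustrative choices, not the paper's exact DGAM design.

```python
import numpy as np

def difference_guided_aggregation(f1, f2):
    """Toy sketch of difference-guided fusion (illustrative, not DGANet's
    actual module): aggregate bitemporal feature maps f1 and f2 under the
    guidance of their element-wise difference."""
    diff = np.abs(f1 - f2)              # change magnitude per element
    gate = 1.0 / (1.0 + np.exp(-diff))  # sigmoid gate computed from the difference
    fused = gate * (f1 + f2) / 2.0      # weight the fused features by change relevance
    return fused
```

Positions where the two feature maps agree receive the neutral gate value 0.5, while positions with a large difference are amplified toward 1.0, which is the intuition behind making the fused features change-relevant.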
