Abstract

Remote sensing image change detection is key to understanding surface changes. Although existing change detection methods have achieved good results, they still miss structural details and their detection accuracy can be further improved. We therefore propose an attention-guided multi-scale context aggregation network (AMCA) for remote sensing image change detection. First, a fully attentional pyramid module (FAPM) enhances the deep feature information of the original images, and a dense feature fusion module (DFFM) fully fuses the bi-temporal features to obtain the changed regions. Second, the channel-wise cross fusion transformer (CCT) and channel-wise cross attention (CCA) not only fuse channel features that focus on different semantic patterns but also bridge the semantic gap between multi-scale features. Next, a transformer decoder maps the learned high-level semantic information back into pixel space to refine the original features. In addition, a context extraction module (CEM) captures the local and global associations of the feature maps. Finally, an attention aggregation module (AAM) effectively combines feature information at different scales. Extensive experiments on three public change detection datasets show that the proposed method outperforms other methods in both visual interpretation and quantitative analysis.
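
To make the pipeline described above concrete, the following is a minimal sketch of an AMCA-style forward pass, assuming a PyTorch implementation. All class names (AMCASketch, FAPM, DFFM) and their internals are simplified placeholders for illustration only, not the authors' code; the CCT, CCA, CEM, and transformer-decoder stages are collapsed into the simple fusion and aggregation steps shown here.

```python
# Hypothetical sketch: shared encoder -> deep-feature enhancement (FAPM stand-in)
# -> per-scale bi-temporal fusion (DFFM stand-in) -> multi-scale aggregation.
import torch
import torch.nn as nn


class FAPM(nn.Module):
    """Placeholder for the fully attentional pyramid module:
    simple channel attention applied to the deepest encoder features."""
    def __init__(self, channels):
        super().__init__()
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.attn(x)


class DFFM(nn.Module):
    """Placeholder for the dense feature fusion module:
    fuses bi-temporal features of one scale into a change feature."""
    def __init__(self, channels):
        super().__init__()
        self.fuse = nn.Sequential(
            nn.Conv2d(2 * channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, f1, f2):
        return self.fuse(torch.cat([f1, f2], dim=1))


class AMCASketch(nn.Module):
    """High-level flow only: Siamese encoder, FAPM on the deepest level,
    per-scale DFFM fusion, then upsample-and-aggregate into a change map."""
    def __init__(self, in_ch=3, widths=(16, 32, 64)):
        super().__init__()
        self.stages = nn.ModuleList()
        prev = in_ch
        for w in widths:
            self.stages.append(nn.Sequential(
                nn.Conv2d(prev, w, 3, stride=2, padding=1),
                nn.BatchNorm2d(w),
                nn.ReLU(inplace=True),
            ))
            prev = w
        self.fapm = FAPM(widths[-1])
        self.dffms = nn.ModuleList([DFFM(w) for w in widths])
        self.head = nn.Conv2d(sum(widths), 1, 1)

    def encode(self, x):
        feats = []
        for stage in self.stages:
            x = stage(x)
            feats.append(x)
        return feats

    def forward(self, t1, t2):
        f1, f2 = self.encode(t1), self.encode(t2)
        # enhance the deepest bi-temporal features
        f1[-1], f2[-1] = self.fapm(f1[-1]), self.fapm(f2[-1])
        # fuse the two time steps at every scale
        fused = [m(a, b) for m, a, b in zip(self.dffms, f1, f2)]
        # aggregate multi-scale change features and predict a change map
        size = fused[0].shape[-2:]
        fused = [nn.functional.interpolate(f, size=size, mode="bilinear",
                                           align_corners=False) for f in fused]
        change = self.head(torch.cat(fused, dim=1))
        return nn.functional.interpolate(change, scale_factor=2, mode="bilinear",
                                         align_corners=False)


if __name__ == "__main__":
    model = AMCASketch()
    t1 = torch.randn(1, 3, 256, 256)   # image at time 1
    t2 = torch.randn(1, 3, 256, 256)   # image at time 2
    print(model(t1, t2).shape)         # torch.Size([1, 1, 256, 256])
```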
