Abstract

Building change detection (BCD) plays a crucial role in urban planning and development and has received extensive attention. However, existing deep learning-based change detection methods suffer from limited accuracy, mainly due to information loss and inadequate feature extraction capability. To overcome these shortcomings, we propose a novel deeply supervised attention-guided network (DSA-Net) for BCD tasks in high-resolution images. In DSA-Net, we introduce a spatial attention mechanism-guided cross-layer addition and skip-connection (CLA-Con-SAM) module to aggregate multi-level contextual information, weaken the heterogeneity between raw image features and difference features, and direct the network's attention to changed regions. We also introduce an atrous spatial pyramid pooling (ASPP) module to extract multi-scale features. To further improve detection performance, we implement a new deep supervision module that enhances the ability of the middle layers to extract more discriminative features. We conduct quantitative and qualitative experiments on two publicly available datasets, LEVIR-CD and WHU Building. Compared with the competing methods, the proposed DSA-Net achieves the best performance on all evaluation metrics. The efficiency analysis shows that DSA-Net strikes a good balance between BCD performance and model complexity/efficiency, with faster convergence and higher robustness.
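The abstract names a spatial attention mechanism that reweights feature maps toward changed regions. The paper's exact formulation is not given here, so the following is only a minimal NumPy sketch of a common CBAM-style spatial attention map (channel-wise average and max pooling, a fusing convolution, then a sigmoid); the fixed uniform kernel stands in for the learned convolution and is purely illustrative.

```python
import numpy as np

def spatial_attention(features, kernel_size=7):
    """Hypothetical CBAM-style spatial attention sketch.

    features: (C, H, W) feature tensor.
    Returns the features reweighted by an (H, W) attention map in (0, 1).
    """
    # Aggregate along the channel axis: average- and max-pooling.
    avg_pool = features.mean(axis=0)           # (H, W)
    max_pool = features.max(axis=0)            # (H, W)
    pooled = np.stack([avg_pool, max_pool])    # (2, H, W)

    # A learned k x k convolution would fuse the two pooled maps; a fixed
    # uniform kernel is used here purely for illustration.
    pad = kernel_size // 2
    padded = np.pad(pooled, ((0, 0), (pad, pad), (pad, pad)), mode="edge")
    H, W = avg_pool.shape
    fused = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            fused[i, j] = padded[:, i:i + kernel_size, j:j + kernel_size].mean()

    # Sigmoid squashes the fused map into (0, 1) attention weights.
    attention = 1.0 / (1.0 + np.exp(-fused))
    return features * attention[None, :, :]
```

In a change-detection network such a map would be applied to the fused difference features so that subsequent layers emphasize likely changed pixels while suppressing background.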
