Abstract

Recently, transformers have achieved notable success in remote sensing (RS) change detection (CD). Their outstanding long-range modeling ability can effectively recognize the change of interest (CoI). However, to obtain precise pixel-level change regions, many methods directly integrate stacked transformer blocks into a UNet-style structure, which incurs high computation costs. In addition, existing methods generally consider bitemporal or differential features separately, so the utilization of ground semantic information remains insufficient. In this paper, we propose the multiscale dual-space interactive perception network (MDIPNet) to fill these two gaps. On the one hand, we simplify the stacked multi-head transformer blocks into a single-layer, single-head attention module and further introduce a lightweight parallel fusion module (LPFM) to perform efficient information integration. On the other hand, building on this simplified attention mechanism, we propose the cross-space perception module (CSPM) to connect the bitemporal and differential feature spaces, which helps our model suppress pseudo changes and mine richer semantic consistency of the CoI. Extensive experiments on three challenging datasets and one urban expansion scene show that, compared with mainstream CD methods, MDIPNet achieves state-of-the-art (SOTA) performance while further controlling computation costs.
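The abstract does not give implementation details of the simplified attention module, but the core operation it names, a single-layer, single-head scaled dot-product attention, can be sketched as follows. This is a minimal NumPy illustration: the token count, feature dimension, and the use of the same tensor for queries, keys, and values are assumptions for demonstration, not the paper's actual configuration.

```python
import numpy as np

def single_head_attention(q, k, v):
    """Single-layer, single-head scaled dot-product attention.

    q, k, v: (n_tokens, d) arrays. The paper replaces stacked multi-head
    transformer blocks with a single such layer; the projections and
    feature shapes used here are illustrative assumptions only.
    """
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                 # (n, n) pairwise similarities
    scores -= scores.max(axis=-1, keepdims=True)  # subtract row max for stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ v                            # (n, d) attended features

# Example: 16 spatial tokens with 8-dimensional features, self-attention
rng = np.random.default_rng(0)
x = rng.standard_normal((16, 8))
out = single_head_attention(x, x, x)
print(out.shape)  # (16, 8)
```

Because the softmax weights in each row sum to one, the output is a convex combination of the value vectors; a single such layer is far cheaper than a stack of multi-head blocks, which is the efficiency argument the abstract makes.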
