Abstract
In remote sensing, change detection has long been a fundamental yet challenging research topic, with profound theoretical significance and extensive application value. Over the past decades, the emergence and development of deep learning have provided new technical support for supervised change detection and advanced its accuracy to unprecedented levels. Nevertheless, owing to its strong reliance on pre-labeled references, which transfer poorly across scenes, supervised learning still requires a degree of human assistance and is therefore not applicable to all change detection tasks. In addition, because changes lack any specific inherent property, they may display inconstant and irregular characteristics when occurring between different land cover categories, making them ill-suited to traditional end-to-end learning formats. In this research, we investigate the unsupervised deep learning mode and develop a novel approach, namely content-invariant translation (CIT), for unsupervised change detection in bi-temporal remotely sensed images. In this method, a new framework integrating adversarial learning and a hybrid attention mechanism is designed to learn a one-sided cross-domain translation from the pre-event domain to the post-event one. During this process, a self-attention module focuses on small-scale image patches to ensure content consistency between each pair of pre-event and translated patches, while a cross-domain module focuses on large-scale images to guarantee style similarity between the translated and post-event patch groups. After translation, the style discrepancies in the bi-temporal images are suppressed while the real content changes are highlighted. Extensive experiments conducted on three typical datasets with diverse types of changes demonstrate the effectiveness of the newly proposed CIT, which outperforms competing methods by a large margin.
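To make the translate-then-compare pipeline concrete, the following is a minimal sketch in PyTorch of the idea the abstract describes: an adversarially trained generator maps the pre-event image toward the post-event style while a content term keeps structure fixed, and the change map is read off the residual between the translated and post-event images. All names here (SimpleGenerator, PatchDiscriminator, train_step, change_map), the loss weight lam, and the threshold are hypothetical; the paper's hybrid self-attention and cross-domain attention modules are stood in for by a plain L1 content surrogate and a PatchGAN-style critic. This illustrates the general technique only, not the authors' implementation.

```python
# Minimal sketch of translation-based unsupervised change detection, assuming
# PyTorch. Module names, loss weights, and the threshold are placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleGenerator(nn.Module):
    """Toy encoder-decoder standing in for the pre-event -> post-event translator."""
    def __init__(self, ch=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(ch, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, ch, 3, padding=1), nn.Tanh(),
        )
    def forward(self, x):
        return self.net(x)

class PatchDiscriminator(nn.Module):
    """PatchGAN-style critic judging whether patches match the post-event style."""
    def __init__(self, ch=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(ch, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 1, 4, padding=1),
        )
    def forward(self, x):
        return self.net(x)

def content_consistency_loss(x_pre, x_trans):
    """Surrogate for the self-attention content term: penalize structural drift
    between each pre-event patch and its translation (plain L1 here)."""
    return F.l1_loss(x_trans, x_pre)

def train_step(G, D, opt_g, opt_d, x_pre, x_post, lam=10.0):
    # Discriminator update: post-event patches are real, translated ones fake.
    x_fake = G(x_pre).detach()
    logits_real, logits_fake = D(x_post), D(x_fake)
    d_loss = (F.binary_cross_entropy_with_logits(logits_real, torch.ones_like(logits_real))
              + F.binary_cross_entropy_with_logits(logits_fake, torch.zeros_like(logits_fake)))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()
    # Generator update: fool D (style similarity) while keeping content invariant.
    x_fake = G(x_pre)
    logits_fake = D(x_fake)
    g_loss = (F.binary_cross_entropy_with_logits(logits_fake, torch.ones_like(logits_fake))
              + lam * content_consistency_loss(x_pre, x_fake))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()

def change_map(G, x_pre, x_post, thresh=0.3):
    """After translation, style discrepancies are suppressed, so a simple
    per-pixel difference between G(x_pre) and x_post exposes content changes."""
    with torch.no_grad():
        diff = (G(x_pre) - x_post).abs().mean(dim=1, keepdim=True)
    return (diff > thresh).float()

if __name__ == "__main__":
    G, D = SimpleGenerator(), PatchDiscriminator()
    opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
    opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
    x_pre, x_post = torch.rand(4, 3, 64, 64), torch.rand(4, 3, 64, 64)
    print(train_step(G, D, opt_g, opt_d, x_pre, x_post))
    print(change_map(G, x_pre, x_post).shape)  # torch.Size([4, 1, 64, 64])
```

Note that the translation is one-sided (pre-event to post-event only, no cycle), matching the abstract's description; the content term is what prevents the generator from erasing the very changes the residual is meant to reveal.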