Abstract

Images acquired by different sensors can differ in intensity and data structure even when they depict the same objects on the ground. These distinct appearances make it difficult for change detection methods to obtain accurate change regions. This letter presents a novel bipartite adversarial autoencoder network with structural self-similarity (BASNet) for detecting land cover changes in heterogeneous remote sensing images. The main novelty lies in two aspects. First, a structural consistency loss is defined by the cross-modal distance in a new affinity space; it forces the network to transform the heterogeneous images into a common domain, aligning the style of the transformed image with that of the original image. Second, an adversarial loss term is designed to force the network to perform image translation with a more consistent style by distinguishing the artificially generated output from the real input pixels. Experiments on four heterogeneous remote sensing image datasets demonstrate the performance of the proposed method.
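The structural consistency idea above can be illustrated with a minimal sketch: each image patch is mapped to a self-similarity (affinity) matrix of pairwise pixel relations, and the loss penalizes the distance between the affinity matrices of the original and the translated patch. This is only an assumed NumPy illustration of the general self-similarity principle, not the letter's actual network or loss implementation; the function names, the Gaussian affinity kernel, and the `sigma` parameter are hypothetical choices.

```python
import numpy as np

def affinity(patch, sigma=1.0):
    # Flatten an (H, W, C) patch to (N, C) pixels and build an (N, N)
    # self-similarity matrix with a Gaussian kernel on pixel distances.
    # The matrix captures structure (relations between pixels) rather
    # than raw intensities, so it is comparable across modalities.
    x = patch.reshape(-1, patch.shape[-1])
    d2 = ((x[:, None, :] - x[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def structural_consistency_loss(src_patch, translated_patch):
    # Mean squared distance between the two affinity matrices:
    # small when the translated patch preserves the source's
    # internal structure, regardless of its intensity domain.
    a_src = affinity(src_patch)
    a_tr = affinity(translated_patch)
    return float(np.mean((a_src - a_tr) ** 2))
```

Because the affinity matrix depends only on relations *within* each image, two patches from different sensors can still be compared in this common affinity space, which is the intuition behind using it for style alignment.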
