Abstract
Change detection (CD) from heterogeneous remote sensing images is an important and challenging problem. Images obtained from different sensors (e.g., synthetic aperture radar (SAR) and optical cameras) characterize distinct properties of the same objects, so changes cannot be detected by directly comparing heterogeneous images. In this article, a new unsupervised change detection (USCD) method based on image translation is proposed. Cycle-consistent adversarial networks (CycleGANs) are employed to learn a subimage-to-subimage mapping from the given pair of heterogeneous images (i.e., acquired before and after the event) in which changes are to be detected. One image can then be translated from its original feature space (e.g., SAR) into the other (e.g., optical), so that the pair of images is represented in a common feature space. Pixels with similar values in the before-event image may have quite different values in the after-event image if changes occur at those locations. A difference map is therefore generated between the translated before-event image and the original after-event image and divided into changed and unchanged parts. However, this preliminary division is not very reliable, so significantly changed and unchanged pixel pairs are selected from the two parts with a clustering technique (i.e., $K$-means). These selected pixel pairs are used to train a binary classifier, which then classifies the remaining pixel pairs to obtain the final CD results. Experimental results on several real datasets demonstrate the effectiveness of the proposed USCD method compared with related methods.
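To make the pipeline described in the abstract concrete, the following minimal Python sketch illustrates the difference-map, $K$-means sample-selection, and classification steps for a pair of co-registered images that have already been mapped into a common feature space (the CycleGAN translation step is omitted). The function name `detect_changes`, the `confident_fraction` parameter, and the use of scikit-learn's `KMeans` and a random-forest classifier are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch, assuming translated_before and after are (H, W, C) arrays
# in the same (e.g., optical) feature space. Thresholds and the classifier
# choice are assumptions for illustration only.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

def detect_changes(translated_before, after, confident_fraction=0.3, seed=0):
    """Return a binary (H, W) change map: 1 = changed, 0 = unchanged."""
    h, w, c = after.shape

    # 1. Difference map between the translated before-event image and the
    #    original after-event image (per-pixel magnitude of the difference).
    diff = np.linalg.norm(translated_before.astype(float) - after.astype(float),
                          axis=-1)
    flat = diff.ravel()

    # 2. K-means (K = 2) splits the difference values into tentative
    #    changed and unchanged parts; the cluster with the larger center
    #    is taken as "changed".
    km = KMeans(n_clusters=2, n_init=10, random_state=seed).fit(flat[:, None])
    changed_cluster = int(np.argmax(km.cluster_centers_.ravel()))
    labels = km.labels_

    # 3. Within each part, keep only the most confident pixels: the largest
    #    differences in the changed part, the smallest in the unchanged part.
    changed_idx = np.flatnonzero(labels == changed_cluster)
    unchanged_idx = np.flatnonzero(labels != changed_cluster)
    n_ch = max(1, int(confident_fraction * changed_idx.size))
    n_un = max(1, int(confident_fraction * unchanged_idx.size))
    changed_idx = changed_idx[np.argsort(flat[changed_idx])[-n_ch:]]
    unchanged_idx = unchanged_idx[np.argsort(flat[unchanged_idx])[:n_un]]

    # 4. Train a binary classifier on the confident pixel pairs and use it
    #    to classify every pixel, yielding the final change map.
    features = np.concatenate([translated_before.reshape(-1, c),
                               after.reshape(-1, c),
                               flat[:, None]], axis=1)
    train_idx = np.concatenate([unchanged_idx, changed_idx])
    train_y = np.concatenate([np.zeros(n_un, dtype=int),
                              np.ones(n_ch, dtype=int)])
    clf = RandomForestClassifier(n_estimators=100, random_state=seed)
    clf.fit(features[train_idx], train_y)
    return clf.predict(features).reshape(h, w)
```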