Abstract
Change detection (CD) methods using synthetic aperture radar (SAR) data have received significant attention in remote sensing Earth observation and mainly comprise knowledge-driven and data-driven approaches. Knowledge-driven CD methods are built on physical theoretical models and offer strong interpretability, but they cannot mine deep, robust features. In contrast, data-driven CD methods can extract deep features but require abundant training samples, which are difficult to obtain for SAR data. To address these limitations, an end-to-end unsupervised CD network based on self-adaptive superpixel segmentation is proposed. First, reliable training samples are selected via an unsupervised pre-task. Then, superpixel generation and a Siamese CD network are integrated into a unified framework and trained end-to-end until globally optimal parameters are obtained. Moreover, backpropagation of the joint loss function drives adaptive adjustment of the superpixels. Finally, a binary change map is produced. Several public SAR CD datasets were used to verify the effectiveness of the proposed method, and a transfer learning experiment was conducted to further explore its change-detection ability and generalization performance. The experimental results demonstrate that the proposed method achieved the most competitive results, outperforming seven other advanced deep-learning-based CD methods. Specifically, it achieved the highest OA, F1-score, and Kappa, and also showed superiority in suppressing speckle noise, refining change boundaries, and improving detection accuracy in small changed areas.
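The abstract's pipeline, difference image from the two SAR acquisitions, superpixel-level aggregation, and thresholding into a binary change map, can be illustrated with a minimal NumPy sketch. This is not the paper's actual network: the log-ratio operator, the mean pooling over superpixels, and the fixed threshold are all simplifying assumptions standing in for the learned Siamese features and adaptive superpixels.

```python
import numpy as np

def superpixel_mean_pool(diff, labels):
    """Replace each pixel with the mean of its superpixel (assumed precomputed labels)."""
    pooled = np.zeros_like(diff, dtype=float)
    for lab in np.unique(labels):
        mask = labels == lab
        pooled[mask] = diff[mask].mean()
    return pooled

def binary_change_map(img_t1, img_t2, labels, thresh=0.5):
    """Toy change map: log-ratio difference image (a common SAR CD choice),
    superpixel averaging to suppress speckle, then a hard threshold."""
    eps = 1e-6  # avoid log(0) on dark SAR pixels
    diff = np.abs(np.log((img_t2 + eps) / (img_t1 + eps)))
    pooled = superpixel_mean_pool(diff, labels)
    return (pooled > thresh).astype(np.uint8)
```

In the proposed method, the threshold and the superpixel boundaries are not fixed as they are here: the joint loss backpropagates through both the CD network and the superpixel generation, so the segmentation adapts during training.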