Abstract
Semantic segmentation is a critical problem in many remote sensing (RS) image applications. Benefiting from large-scale pixel-level labeled data and the continuous evolution of deep neural network architectures, the performance of semantic segmentation approaches has steadily improved. However, deploying a well-trained model in unseen and diverse testing environments remains a major challenge: a large gap between the data distributions of the train and test domains causes severe performance loss, while manual dense labeling is costly and does not scale. To this end, we propose an unsupervised domain adaptation framework for RS image semantic segmentation that is both practical and effective. The framework is built on the consistency principle, comprising cycle consistency in the input space and self-supervised consistency in the training stage. Specifically, we introduce cycle-consistent generative adversarial networks to reduce the discrepancy between the source and target distributions by translating one into the other. The translated source data then drive supervised training of the semantic segmentation model. To provide self-supervision for the unlabeled target data, we enforce consistency of model predictions across transformations of the target images. Experiments and extensive ablation studies demonstrate the effectiveness of the proposed approach on two challenging benchmarks, on which we achieve up to 9.95% and 7.53% improvements in accuracy over state-of-the-art methods, respectively.
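To make the self-supervised consistency idea in the abstract concrete, the following is a minimal sketch, assuming a PyTorch segmentation model and using a horizontal flip as the image transformation; the names (seg_model, consistency_loss) and the choice of mean squared error are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of prediction consistency across target image transformations.
# Assumptions (not from the paper): PyTorch, a horizontal flip as the transform,
# and MSE as the disagreement penalty.
import torch
import torch.nn.functional as F

def consistency_loss(seg_model, target_images):
    """Encourage predictions on an unlabeled target batch and on a transformed
    copy (here: a horizontal flip) to agree once the transform is undone."""
    # Predictions on the original unlabeled target batch: (B, C, H, W) logits.
    logits = seg_model(target_images)
    probs = F.softmax(logits, dim=1)

    # Predictions on a horizontally flipped copy of the same batch.
    flipped = torch.flip(target_images, dims=[3])
    logits_flipped = seg_model(flipped)
    # Undo the flip so both predictions live in the same pixel coordinates.
    probs_flipped = F.softmax(torch.flip(logits_flipped, dims=[3]), dim=1)

    # Penalize disagreement; the original prediction is detached so it acts
    # as a pseudo-target for the transformed view.
    return F.mse_loss(probs_flipped, probs.detach())
```

In training, a loss of this form would be added to the supervised loss computed on the translated source data, giving the unlabeled target images a gradient signal without requiring any target annotations.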