Abstract

Unsupervised domain adaptation (UDA) has become an important technique for cross-domain semantic segmentation (SS) in the remote sensing community and has obtained remarkable results. However, when transferring from high-resolution (HR) remote sensing images to low-resolution (LR) images, existing UDA frameworks often fail to segment the LR target images accurately, especially for small objects (e.g., cars), due to the severe spatial resolution shift. In this article, to improve the segmentation ability of UDA models on LR target images and small objects, we propose a novel multitask domain adaptation network (DASRSNet) for SS of remote sensing images with the aid of super-resolution (SR). The proposed DASRSNet contains a domain adaptation for SS (DASS) branch, a domain adaptation for SR (DASR) branch, and a feature affinity (FA) module. Specifically, the DASS and DASR branches share the same encoder to extract domain-invariant features for the source and target domains, and the two branches use separate decoders and discriminators to perform the cross-domain SS and SR tasks, which mitigate the domain shift in the output space and the image space, respectively. Finally, the FA module, which involves the proposed FA loss, is applied to enhance the affinity between the SS features and the SR features for both the source and target domains. Experimental results on cross-city aerial datasets demonstrate the effectiveness and superiority of our DASRSNet over recent UDA models.
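The following is a minimal sketch of the multitask layout described above, assuming a PyTorch-style implementation. The module names (the shared encoder, the SS and SR branch bodies and heads) and the cosine-similarity form of the feature-affinity (FA) loss are illustrative assumptions for exposition, not the authors' exact architecture; the adversarial discriminators are omitted.

```python
# Illustrative sketch only: a shared encoder feeding an SS branch and an SR
# branch, plus a simple feature-affinity loss between the two branch features.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DASRSNetSketch(nn.Module):
    def __init__(self, in_ch=3, num_classes=6, feat_ch=64, sr_scale=2):
        super().__init__()
        # Shared encoder: extracts features used by both branches.
        self.encoder = nn.Sequential(
            nn.Conv2d(in_ch, feat_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat_ch, feat_ch, 3, padding=1), nn.ReLU(inplace=True),
        )
        # DASS branch: task-specific features and per-pixel class scores.
        self.ss_body = nn.Sequential(
            nn.Conv2d(feat_ch, feat_ch, 3, padding=1), nn.ReLU(inplace=True))
        self.ss_head = nn.Conv2d(feat_ch, num_classes, 1)
        # DASR branch: task-specific features and a super-resolved image.
        self.sr_body = nn.Sequential(
            nn.Conv2d(feat_ch, feat_ch, 3, padding=1), nn.ReLU(inplace=True))
        self.sr_head = nn.Sequential(
            nn.Conv2d(feat_ch, in_ch * sr_scale ** 2, 3, padding=1),
            nn.PixelShuffle(sr_scale),
        )

    def forward(self, x):
        shared = self.encoder(x)
        ss_feat = self.ss_body(shared)
        sr_feat = self.sr_body(shared)
        seg_logits = self.ss_head(ss_feat)  # output-space (SS) prediction
        sr_image = self.sr_head(sr_feat)    # image-space (SR) prediction
        return seg_logits, sr_image, ss_feat, sr_feat


def feature_affinity_loss(ss_feat, sr_feat):
    """Assumed FA loss: pull the SS and SR branch features of the same image
    toward high cosine similarity (1 - mean cosine similarity)."""
    sim = F.cosine_similarity(ss_feat.flatten(1), sr_feat.flatten(1), dim=1)
    return (1.0 - sim).mean()


# Example usage on a dummy LR image.
model = DASRSNetSketch()
logits, sr, fs, fr = model(torch.randn(1, 3, 128, 128))
loss_fa = feature_affinity_loss(fs, fr)
```

In this reading, the shared encoder is pushed toward domain-invariant features by the two adversarially aligned task outputs, while the FA term couples the segmentation and super-resolution representations so that detail recovered by the SR branch benefits small-object segmentation.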
