Abstract

Universal domain adaptation (UniDA) aims to transfer knowledge learned from a labeled source domain to an unlabeled target domain without any constraint on the label sets. However, domain shift and category shift make UniDA extremely challenging, largely because the model must identify both shared “known” samples and private “unknown” samples. Previous methods barely exploit the intrinsic manifold structure shared by the two domains for feature alignment, and they rely on softmax-based scores, whose class-competition nature hampers the detection of underlying “unknown” samples. Therefore, in this paper, we propose a novel evidential neighborhood contrastive learning framework, called TNT, to address these issues. Specifically, TNT first introduces a new domain-alignment principle: semantically consistent samples should be geometrically adjacent to each other, whether within or across domains. Based on this criterion, we design a cross-domain multi-sample contrastive loss over mutual nearest neighbors to achieve common-category matching and private-category separation. Second, for accurate “unknown”-sample detection, TNT introduces a class-competition-free uncertainty score from the perspective of evidential deep learning. Instead of setting a single threshold, TNT learns a category-aware heterogeneous threshold vector to reject diverse “unknown” samples. Extensive experiments on three benchmarks demonstrate that TNT significantly outperforms previous state-of-the-art UniDA methods.
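To make the neighborhood-based alignment idea concrete, below is a minimal PyTorch sketch of a cross-domain, multi-positive contrastive loss built on mutual nearest neighbors. It is an illustration of the principle stated in the abstract, not the authors' exact objective: the helper `mutual_nearest_neighbors`, the neighborhood size `k`, and the temperature `tau` are assumptions chosen for readability.

```python
import torch
import torch.nn.functional as F


def mutual_nearest_neighbors(feat_s, feat_t, k=5):
    """Boolean mask M[i, j] = True when source sample i and target sample j
    appear in each other's top-k cosine neighborhoods (hypothetical helper;
    the paper's exact neighbor criterion may differ)."""
    feat_s = F.normalize(feat_s, dim=1)
    feat_t = F.normalize(feat_t, dim=1)
    sim = feat_s @ feat_t.t()                     # (Ns, Nt) cosine similarities
    nn_of_s = sim.topk(k, dim=1).indices          # target neighbors of each source row
    nn_of_t = sim.topk(k, dim=0).indices          # source neighbors of each target column
    mask_s = torch.zeros_like(sim, dtype=torch.bool).scatter_(1, nn_of_s, True)
    mask_t = torch.zeros_like(sim, dtype=torch.bool).scatter_(0, nn_of_t, True)
    return mask_s & mask_t


def cross_domain_contrastive_loss(feat_s, feat_t, k=5, tau=0.1):
    """Multi-positive InfoNCE-style loss: each source anchor is pulled toward
    its mutual nearest target neighbors and pushed away from all other
    target samples."""
    pos = mutual_nearest_neighbors(feat_s, feat_t, k)
    sim = F.normalize(feat_s, dim=1) @ F.normalize(feat_t, dim=1).t() / tau
    log_prob = sim - sim.logsumexp(dim=1, keepdim=True)   # log-softmax over target samples
    per_anchor = -(log_prob * pos).sum(dim=1) / pos.sum(dim=1).clamp(min=1)
    return per_anchor[pos.any(dim=1)].mean()              # anchors with no mutual neighbor are skipped
```

Treating only mutual (rather than one-sided) neighbors as positives is what ties this sketch to the abstract's criterion: samples from private categories tend not to be reciprocal neighbors across domains, so they contribute no positive pairs and are pushed apart by the denominator.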

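The “class competition-free” uncertainty score can likewise be sketched with the standard evidential deep learning recipe, in which classifier outputs parameterize a Dirichlet distribution and uncertainty is the vacuity K / S. The per-class `thresholds` vector below merely stands in for the category-aware heterogeneous thresholds mentioned in the abstract; how they are learned is not specified here, and the softplus evidence function is one common EDL choice rather than necessarily the authors'.

```python
import torch
import torch.nn.functional as F


def evidential_uncertainty(logits):
    """Dirichlet vacuity u = K / S: large when the total collected evidence is
    small, computed without the class competition of a softmax."""
    evidence = F.softplus(logits)        # non-negative per-class evidence
    alpha = evidence + 1.0               # Dirichlet concentration parameters
    strength = alpha.sum(dim=1)          # total Dirichlet strength S
    return logits.size(1) / strength     # uncertainty in (0, 1]


def detect_unknown(logits, thresholds):
    """Flag a target sample as "unknown" when its uncertainty exceeds the
    threshold associated with its predicted class (a hypothetical use of the
    category-aware threshold vector)."""
    u = evidential_uncertainty(logits)
    pred = logits.argmax(dim=1)
    return u > thresholds[pred]          # boolean "unknown" mask
```

For example, `detect_unknown(classifier(target_features), thresholds)` would return a mask selecting the target samples to be rejected as “unknown”, while the remaining samples are assigned their predicted shared class.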