Abstract

Cross-domain semantic segmentation, which aims to address the distribution shift while adapting from a labeled source domain to an unlabeled target domain, has achieved great progress in recent years. However, most existing work adopts a source-to-target adaptation path, which often suffers from class-mismatch or class-imbalance issues. We design PBAL, a prototypical bidirectional adaptation and learning technique that introduces bidirectional prototype learning and prototypical self-training for optimal inter-domain alignment and adaptation. We perform bidirectional alignment in a complementary and cooperative manner, which effectively balances dominant and tail categories as well as easy and hard samples. In addition, we derive prototypes efficiently from a source-trained classifier, which enables class-aware adaptation as well as synchronous prototype updating and network optimization. Further, we re-examine self-training and introduce prototypical contrast on top of it, which greatly improves inter-domain alignment by promoting better intra-class compactness and inter-class separability in the feature space. Extensive experiments over two widely studied benchmarks show that the proposed PBAL achieves superior domain adaptation performance as compared with the state-of-the-art.
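
Two of the ideas above, deriving class prototypes from a source-trained classifier's weights and using a prototypical contrastive objective to tighten intra-class compactness, can be sketched as follows. This is a minimal illustration under assumed conventions, not the authors' implementation; the function names, the InfoNCE-style loss form, and the temperature `tau` are illustrative choices.

```python
import numpy as np

def prototypes_from_classifier(W):
    # Rows of a linear classifier's weight matrix W (shape C x D) can
    # serve as class prototypes; normalize each row to unit length.
    return W / np.linalg.norm(W, axis=1, keepdims=True)

def prototypical_contrastive_loss(feats, labels, protos, tau=0.1):
    # InfoNCE-style prototypical contrast (illustrative form): pull each
    # feature toward its own class prototype, push it away from the rest.
    feats = feats / np.linalg.norm(feats, axis=1, keepdims=True)
    logits = feats @ protos.T / tau               # (N, C) similarities
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(len(labels)), labels].mean()
```

Because the prototypes are read directly off the classifier weights, they update whenever the network is optimized, which is one way to realize the synchronous prototype updating mentioned above.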
