Abstract

Learning models that can generalize to previously unseen domains, to which we have no access during training, is a fundamental yet challenging problem in machine learning. In this paper, we propose meta variational inference (MetaVI), a variational Bayesian framework of meta-learning for cross-domain image classification. Within the meta-learning setting, MetaVI learns a probabilistic latent variable model by maximizing a meta evidence lower bound (Meta ELBO) for knowledge transfer across domains. To enhance the discriminative ability of the model, we further introduce a Wasserstein distance based constraint into the variational objective, yielding Wasserstein MetaVI, which substantially improves classification performance. By casting cross-domain learning as a probabilistic inference problem, MetaVI offers the first principled variational meta-learning framework for this task. In addition, we collect a new visual recognition dataset, which will be released to the public, to provide a more challenging benchmark for cross-domain learning. Extensive experimental evaluation and ablation studies on four benchmarks show that Wasserstein MetaVI achieves new state-of-the-art performance, surpassing previous methods and demonstrating its effectiveness.
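To make the shape of such an objective concrete, the following is a minimal illustrative sketch (not the paper's actual model or code): an ELBO-style objective with a Gaussian prior KL term, penalized by an empirical 1-D Wasserstein-1 distance between latent samples from two domains. All function names, the toy data, and the choice of a closed-form 1-D Wasserstein distance are assumptions made purely for illustration.

```python
# Illustrative sketch only: ELBO with a Wasserstein penalty between
# latent samples of two domains. Not the paper's implementation.
import numpy as np

def gaussian_kl(mu, logvar):
    """KL( N(mu, exp(logvar)) || N(0, I) ), summed over latent dims."""
    return 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar, axis=-1)

def wasserstein_1d(a, b):
    """Empirical Wasserstein-1 distance between equal-size 1-D samples,
    computed via sorted order statistics."""
    return np.mean(np.abs(np.sort(a) - np.sort(b)))

def penalized_elbo(log_lik, mu, logvar, z_src, z_tgt, lam=1.0):
    """Mean ELBO (E[log p(x|z)] - KL) minus a weighted Wasserstein term."""
    elbo = np.mean(log_lik - gaussian_kl(mu, logvar))
    return elbo - lam * wasserstein_1d(z_src, z_tgt)

# Toy inputs standing in for encoder outputs on two domains.
rng = np.random.default_rng(0)
mu = rng.normal(size=(8, 4))
logvar = np.zeros((8, 4))
log_lik = rng.normal(-1.0, 0.1, size=8)
z_src = rng.normal(0.0, 1.0, size=64)
z_tgt = rng.normal(0.5, 1.0, size=64)

obj = penalized_elbo(log_lik, mu, logvar, z_src, z_tgt, lam=0.1)
print(float(obj))
```

Maximizing such an objective trades off reconstruction and prior regularization (the ELBO) against a distributional alignment penalty; when the two latent sample sets coincide, the Wasserstein term vanishes and the objective reduces to the plain ELBO.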
