Abstract
Learning models that generalize to previously unseen domains, to which we have no access during training, is a fundamental yet challenging problem in machine learning. In this paper, we propose meta variational inference (MetaVI), a variational Bayesian framework of meta-learning for cross-domain image classification. Within the meta-learning setting, MetaVI learns a probabilistic latent variable model by maximizing a meta evidence lower bound (Meta ELBO) for knowledge transfer across domains. To enhance the discriminative ability of the model, we further introduce a Wasserstein-distance-based constraint into the variational objective, leading to Wasserstein MetaVI, which substantially improves classification performance. By casting cross-domain learning as a probabilistic inference problem, MetaVI offers the first principled variational meta-learning framework for this setting. In addition, we collect a new visual recognition dataset that contributes a more challenging benchmark for cross-domain learning, which will be released to the public. Extensive experiments and ablation studies on four benchmarks show that Wasserstein MetaVI achieves new state-of-the-art performance, surpassing previous methods.
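The abstract gives no implementation details, but the general shape of a Wasserstein-regularized variational objective can be sketched as follows. Everything here is an illustrative assumption, not the paper's actual Meta ELBO: the latents are simplified to 1-D Gaussians (where both the KL divergence and the 2-Wasserstein distance have closed forms), and the discriminative term is modeled as rewarding separation between class-conditional latent distributions.

```python
import math

def gaussian_kl(mu_q, sigma_q, mu_p, sigma_p):
    # KL( N(mu_q, sigma_q^2) || N(mu_p, sigma_p^2) ), closed form for 1-D Gaussians
    return (math.log(sigma_p / sigma_q)
            + (sigma_q ** 2 + (mu_q - mu_p) ** 2) / (2 * sigma_p ** 2)
            - 0.5)

def gaussian_w2_sq(mu_a, sigma_a, mu_b, sigma_b):
    # Squared 2-Wasserstein distance between two 1-D Gaussians (closed form)
    return (mu_a - mu_b) ** 2 + (sigma_a - sigma_b) ** 2

def wasserstein_vi_loss(log_lik, mu_q, sigma_q,
                        mu_c1, s_c1, mu_c2, s_c2, lam=1.0):
    # Negative ELBO: expected log-likelihood term plus KL to a standard-normal prior
    neg_elbo = -log_lik + gaussian_kl(mu_q, sigma_q, 0.0, 1.0)
    # Hypothetical discriminative constraint: subtract the Wasserstein distance
    # between two class-conditional latents, so greater separation lowers the loss
    return neg_elbo - lam * gaussian_w2_sq(mu_c1, s_c1, mu_c2, s_c2)
```

In a real model the Gaussians would be multivariate, amortized by an encoder network, and the objective averaged over meta-learning episodes; this toy version only shows how a Wasserstein term can be attached to a variational bound.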