Abstract

Multi-label classification is an important research topic in machine learning, for which exploiting label dependencies is an effective modeling principle. Recently, probabilistic models have shown great potential in discovering dependencies among labels. In this paper, motivated by the recent success of multi-view learning in improving generalization performance, we propose a novel multi-view probabilistic model named latent conditional Bernoulli mixture (LCBM) for multi-label classification. LCBM is a generative model that takes features from different views as inputs; conditional on a latent subspace shared across the views, a Bernoulli mixture model is adopted to capture label dependencies. Within each component of the mixture, the labels are only weakly correlated, which makes computation tractable. We use the mean-field variational inference framework to carry out approximate posterior inference in the probabilistic model, proposing a Gaussian mixture variational autoencoder (GMVAE) for effective posterior approximation. We further develop a scalable stochastic training algorithm that efficiently optimizes the model and variational parameters, and derive an efficient prediction procedure based on greedy search. Experimental results on multiple benchmark datasets show that our approach outperforms other state-of-the-art methods under various metrics.
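To make the core modeling idea concrete, the following is a minimal illustrative sketch (not the authors' implementation) of a Bernoulli mixture over label vectors, in which labels are conditionally independent within each component, together with a greedy-search style prediction that fixes one label at a time. The mixing weights `pi` and Bernoulli means `mu` are toy placeholders; in LCBM they would be produced conditional on the shared latent subspace.

```python
import numpy as np

rng = np.random.default_rng(0)

K, L = 3, 5                                  # mixture components, number of labels (toy sizes)
pi = np.array([0.5, 0.3, 0.2])               # mixing weights (in LCBM, conditioned on the latent code)
mu = rng.uniform(0.05, 0.95, size=(K, L))    # per-component Bernoulli means for each label

def label_likelihood(y, pi, mu):
    """p(y) = sum_k pi_k * prod_j Bern(y_j; mu_kj).

    Labels are independent inside each component, so the joint over
    labels factorizes per component; dependence arises only through
    the mixture itself.
    """
    comp = np.prod(mu**y * (1.0 - mu)**(1.0 - y), axis=1)  # shape (K,)
    return float(pi @ comp)

def greedy_predict(pi, mu):
    """Greedy search over label assignments: for each label position,
    pick the value (0 or 1) that maximizes the mixture likelihood of
    the partial assignment so far."""
    y = np.zeros(L)
    for j in range(L):
        scores = []
        for v in (0.0, 1.0):
            y[j] = v
            part = mu[:, :j + 1]**y[:j + 1] * (1.0 - mu[:, :j + 1])**(1.0 - y[:j + 1])
            scores.append(float(pi @ np.prod(part, axis=1)))
        y[j] = float(np.argmax(scores))
    return y
```

Because the per-component likelihood factorizes over labels, scoring a candidate assignment costs O(K·L), so the greedy pass over all labels is O(K·L²) rather than exponential in L.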
