Abstract

Gaussian processes (GPs) are powerful Bayesian nonparametric tools widely used in probabilistic modeling, and mixtures of GPs (MGPs) were subsequently introduced to make data modeling more flexible. However, MGPs are not directly applicable to multiview learning. To improve the modeling ability of MGPs, in this paper we propose a new multiview learning framework for MGPs and instantiate it for classification. The framework keeps the divergence between views as small as possible while keeping the posterior probability of each view as large as possible. Specifically, we regularize the posterior distribution of the latent variables by enforcing consistency among the posterior distributions of the latent functions across views. Since the model cannot be solved analytically, we also derive variational inference and optimization algorithms for the classification model. Experimental results on multiple real-world datasets show that the proposed method outperforms the original MGP model and several state-of-the-art multiview learning methods, indicating the effectiveness of the proposed multiview learning framework for MGPs.
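
To make the trade-off described above concrete, here is a minimal sketch of one such objective; the notation and the specific KL-based consistency penalty are our assumptions for illustration, not the paper's exact formulation. For $V$ views with latent functions $\mathbf{f}_v$, labels $\mathbf{y}$, GP priors $p(\mathbf{f}_v)$, and per-view variational posteriors $q_v(\mathbf{f}_v)$, one could maximize a consistency-regularized evidence lower bound:

\[
\mathcal{L} \;=\; \sum_{v=1}^{V} \Big( \mathbb{E}_{q_v(\mathbf{f}_v)}\!\big[\log p(\mathbf{y}\mid \mathbf{f}_v)\big] \;-\; \mathrm{KL}\big(q_v(\mathbf{f}_v)\,\|\,p(\mathbf{f}_v)\big) \Big) \;-\; \lambda \sum_{v \neq v'} \mathrm{KL}\big(q_v(\mathbf{f}_v)\,\|\,q_{v'}(\mathbf{f}_{v'})\big),
\]

where the per-view terms push each view's posterior probability of the data up, and the cross-view term (with assumed weight $\lambda > 0$) penalizes divergence between the views' posteriors, implementing the consistency regularization in a variational-inference setting.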
