Abstract

Gaussian processes (GPs) constitute one of the most important Bayesian machine learning approaches. Several researchers have postulated mixtures of Gaussian processes as a means of dealing with non-stationary covariance functions, discontinuities, multi-modality, and overlapping output signals. In existing works, mixtures of Gaussian processes rely on a gating function defined over the space of model input variables, so that each postulated mixture-component Gaussian process is effectively restricted to a limited subset of the input space. Moreover, the applicability of these models is limited to regression tasks. In this paper, for the first time in the literature, we devise a Gaussian process mixture model especially suitable for multiclass classification applications: we consider a GP classification scheme whose prior distribution is a fully generative nonparametric Bayesian model with power-law behavior, generating Gaussian processes over the whole input space of the learned task. We provide an efficient algorithm for model inference, based on the variational Bayesian framework, and exhibit its efficacy using benchmark and real-world classification datasets.
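As a rough illustration of the "power-law behavior" the abstract refers to (not the paper's actual model), a Pitman-Yor stick-breaking construction yields nonparametric mixture weights whose sizes follow a power law; the hyperparameter names `alpha` (concentration) and `d` (discount) below are illustrative assumptions:

```python
import numpy as np

def pitman_yor_stick_breaking(alpha, d, n_sticks, rng):
    """Draw truncated Pitman-Yor mixture weights via stick breaking.

    v_k ~ Beta(1 - d, alpha + k*d),  pi_k = v_k * prod_{j<k} (1 - v_j).
    With discount d > 0 the sorted weights decay as a power law,
    unlike the geometric decay of the Dirichlet process (d = 0).
    """
    k = np.arange(1, n_sticks + 1)
    v = rng.beta(1.0 - d, alpha + d * k)
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
    return v * remaining

rng = np.random.default_rng(0)
pi = pitman_yor_stick_breaking(alpha=1.0, d=0.5, n_sticks=1000, rng=rng)
# In a GP mixture classifier, pi_k would weight the k-th component GP.
```

In the model sketched here, each weight would be paired with a component Gaussian process defined over the whole input space, in contrast with gating-function mixtures that partition the inputs.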
