Abstract

The Gaussian process is a powerful statistical learning model that has been widely applied to nonlinear regression and classification. However, it fails to model multi-modal data from a non-stationary source, since a Gaussian process prior is generally stationary. Based on the idea of the mixture of experts, the mixture of Gaussian processes was established to increase model flexibility. On the other hand, the Gaussian process is also sensitive to outliers, and robust Gaussian processes with heavy-tailed noise distributions have therefore been suggested. In practical applications, a dataset may be multi-modal and contain outliers at the same time. To overcome these two difficulties together, we propose a mixture of robust Gaussian processes (MRGP) model and establish a precise hard-cut EM algorithm for learning its parameters. Since exact inference is intractable, because the non-Gaussian noise densities enter the likelihood of the proposed model on the dataset, we employ a variational bounding method to approximate the marginal likelihood functions so that the hard-cut EM algorithm can be implemented effectively. Moreover, we conduct experiments on both synthetic and real-world datasets to evaluate and compare our proposed MRGP method with several competitive nonlinear regression methods. The experimental results demonstrate that the MRGP model with the hard-cut EM algorithm is considerably more effective and robust than the competing nonlinear regression models.
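The abstract's hard-cut EM alternates a hard assignment of each data point to one expert (E-step) with refitting each Gaussian process expert on its assigned points (M-step). The paper's MRGP additionally uses heavy-tailed noise and a variational bound, which is beyond a short sketch; the minimal Python illustration below shows only the hard-cut EM skeleton with ordinary Gaussian-noise GP experts on a toy bimodal dataset. All function names, hyperparameters, and the toy data are our own illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rbf(x1, x2, length=1.0, amp=1.0):
    # Squared-exponential kernel on 1-D inputs (illustrative default hyperparameters).
    return amp * np.exp(-0.5 * (x1[:, None] - x2[None, :]) ** 2 / length ** 2)

class GPExpert:
    """A single GP regressor with Gaussian noise (simplification of a robust expert)."""
    def __init__(self, noise_var=0.25):
        self.noise_var = noise_var

    def fit(self, x, y):
        self.x, self.y = x, y
        K = rbf(x, x) + self.noise_var * np.eye(len(x))
        self.alpha = np.linalg.solve(K, y)   # K^{-1} y, reused for predictions
        return self

    def predict(self, xs):
        # Posterior mean at test inputs xs.
        return rbf(xs, self.x) @ self.alpha

def hard_cut_em(x, y, n_experts=2, n_iter=10, seed=0):
    """Hard-cut EM: alternate hard assignments (E-step) and per-expert GP fits (M-step)."""
    rng = np.random.default_rng(seed)
    z = rng.integers(n_experts, size=len(x))          # random initial assignments
    for _ in range(n_iter):
        # M-step: refit each expert on its currently assigned points.
        experts = [GPExpert().fit(x[z == k], y[z == k]) for k in range(n_experts)]
        # Hard E-step: reassign each point to the expert with the smallest squared residual.
        preds = np.stack([e.predict(x) for e in experts])   # shape (n_experts, n)
        z = np.argmin((preds - y[None, :]) ** 2, axis=0)
    return experts, z

# Toy bimodal data: two noisy branches observed over the same input range.
rng = np.random.default_rng(1)
xg = np.linspace(0.0, 5.0, 60)
x = np.concatenate([xg, xg])
y = np.concatenate([np.sin(xg), np.sin(xg) + 4.0]) + 0.1 * rng.standard_normal(120)
experts, z = hard_cut_em(x, y)
```

In the full MRGP, the Gaussian residual criterion in the E-step would be replaced by the (variationally bounded) heavy-tailed likelihood, which is what gives the model its robustness to outliers.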
