Abstract

This paper gives a detailed account of implementing the mixture of Gaussian processes (MGP) model and develops its application to Bayesian optimization (BayesOpt). It also develops techniques for identifying the mixture components of the MGP and introduces an alternative gating network based on Dirichlet distributions. In tests on tuning the hyperparameters of common machine learning algorithms, BayesOpt based on the resulting MGP model significantly outperforms BayesOpt based on Gaussian process regression in terms of optimization efficiency. These results demonstrate the effectiveness of the proposed methods and suggest broader applications for the MGP model and the BayesOpt built on it.
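To make the idea concrete, the following is a minimal, illustrative sketch of a mixture-of-GP surrogate driving a BayesOpt loop. It is not the paper's method: the cluster-based, distance-weighted gate stands in for the Dirichlet gating network, the one-dimensional toy objective stands in for a hyperparameter-tuning loss, and the use of scikit-learn GP experts with an expected-improvement acquisition is an assumption made for the demo.

```python
# Illustrative sketch only: a toy mixture-of-GP surrogate inside a BayesOpt loop.
# The gating rule, objective, and library choices are assumptions for this demo,
# not the gating network or experiments described in the paper.
import numpy as np
from scipy.stats import norm
from sklearn.cluster import KMeans
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def objective(x):
    # Hypothetical 1-D test function standing in for a hyperparameter-tuning loss.
    return np.sin(3.0 * x) + 0.1 * x**2

def fit_mixture(X, y, n_experts=2):
    """Partition the data and fit one GP expert per partition."""
    labels = KMeans(n_clusters=n_experts, n_init=10, random_state=0).fit_predict(X)
    experts, centers = [], []
    for k in range(n_experts):
        gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-6)
        gp.fit(X[labels == k], y[labels == k])
        experts.append(gp)
        centers.append(X[labels == k].mean(axis=0))
    return experts, np.array(centers)

def mixture_predict(Xq, experts, centers):
    """Combine expert predictions with a soft, distance-based gate (a simple stand-in)."""
    w = np.exp(-((Xq - centers.T) ** 2))            # (n_query, n_experts) gate weights
    w /= w.sum(axis=1, keepdims=True)
    mus, sds = zip(*(gp.predict(Xq, return_std=True) for gp in experts))
    mu = np.sum(w * np.column_stack(mus), axis=1)
    sd = np.sqrt(np.sum(w * np.column_stack(sds) ** 2, axis=1))
    return mu, sd

def expected_improvement(mu, sd, best):
    # Standard EI for minimization, computed from the mixture's predictive mean and std.
    z = (best - mu) / np.maximum(sd, 1e-9)
    return (best - mu) * norm.cdf(z) + sd * norm.pdf(z)

# BayesOpt loop: refit the mixture, maximize EI over a candidate grid, evaluate, repeat.
X = rng.uniform(-3, 3, size=(6, 1))
y = objective(X).ravel()
grid = np.linspace(-3, 3, 400).reshape(-1, 1)
for _ in range(10):
    experts, centers = fit_mixture(X, y)
    mu, sd = mixture_predict(grid, experts, centers)
    x_next = grid[np.argmax(expected_improvement(mu, sd, y.min()))]
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next)[0])

print("best x:", X[np.argmin(y)].item(), "best f:", y.min())
```

The sketch only conveys the surrogate/acquisition structure that the abstract refers to; the paper's contribution lies in how the mixture components are found and in the Dirichlet-based gating network, which this toy gate does not reproduce.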
