Abstract

In this paper, we present a Gaussian Mixture Model (GMM) with the kernel trick embedded, called kernel GMM. The basic idea is to embed the kernel trick into the EM algorithm and derive a parameter estimation algorithm for GMM in feature space. Kernel GMM can be viewed as a Bayesian kernel method. Compared with most classical kernel methods, the proposed method solves problems within a probabilistic framework. Moreover, it handles nonlinear problems better than the traditional GMM. To avoid the high computational cost that most kernel methods incur on large-scale data sets, we also employ a Monte Carlo sampling technique to speed up kernel GMM, making it more practical and efficient. Experimental results on synthetic and real-world data sets demonstrate that the proposed approach achieves satisfying performance.
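To make the idea of running EM for a GMM in feature space concrete, the sketch below shows one possible kernelized EM loop. It is an illustrative reconstruction under assumptions not stated in the abstract (an RBF kernel, isotropic component covariances in feature space, and a simplified Gaussian-like density), not the authors' exact derivation, and it omits the Monte Carlo speed-up.

```python
# Minimal sketch of an EM-style kernel GMM with isotropic components in
# feature space. Assumed model form; not the paper's exact algorithm.
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_gmm(K, n_components=2, n_iter=50, seed=0):
    n = K.shape[0]
    rng = np.random.default_rng(seed)
    # Random soft assignments (responsibilities) to start EM.
    R = rng.dirichlet(np.ones(n_components), size=n)   # shape (n, k)
    for _ in range(n_iter):
        # M-step: mixing weights and feature-space squared distances.
        Nk = R.sum(axis=0)                  # effective counts per component
        pi = Nk / n                         # mixing weights
        W = R / Nk                          # column-normalized responsibilities
        # ||phi(x_i) - mu_k||^2 expressed purely through the Gram matrix:
        # K_ii - 2 * sum_j W_jk K_ij + sum_{j,l} W_jk W_lk K_jl
        cross = K @ W                                   # (n, k)
        quad = np.einsum('jk,jl,lk->k', W, K, W)        # (k,)
        d2 = np.diag(K)[:, None] - 2.0 * cross + quad[None, :]
        d2 = np.maximum(d2, 1e-12)
        # Isotropic variance per component (assumed model form).
        sigma2 = (R * d2).sum(axis=0) / Nk
        # E-step: responsibilities from a Gaussian-like density in feature space.
        log_r = np.log(pi)[None, :] - 0.5 * d2 / sigma2[None, :] \
                - 0.5 * np.log(sigma2)[None, :]
        log_r -= log_r.max(axis=1, keepdims=True)
        R = np.exp(log_r)
        R /= R.sum(axis=1, keepdims=True)
    return R, pi

# Usage: two concentric rings are not separable by a standard GMM in input
# space, but the kernelized variant can pick them apart.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    t = rng.uniform(0, 2 * np.pi, 200)
    inner = np.c_[np.cos(t[:100]), np.sin(t[:100])] + 0.05 * rng.standard_normal((100, 2))
    outer = 3 * np.c_[np.cos(t[100:]), np.sin(t[100:])] + 0.05 * rng.standard_normal((100, 2))
    X = np.vstack([inner, outer])
    R, pi = kernel_gmm(rbf_kernel(X, gamma=2.0), n_components=2)
    print("mixing weights:", pi)
```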
