Abstract

The Gaussian mixture model (GMM) is a widely used probabilistic clustering model. The incremental learning algorithm for GMM is the basis of a variety of more complex incremental learning algorithms. It is typically applied to real-time or massive-data problems where the standard Expectation-Maximization (EM) algorithm does not work. However, the incremental learning algorithm may yield lower cluster quality than the standard EM algorithm. To achieve a high-quality and fast incremental GMM learning algorithm, we develop an algorithmic method for incremental learning of GMM on a GPU-CPU hybrid system. Our method uses the model evolution history to approximate the model order and adopts both a hypothesis test and Euclidean distance to test mixture components for equality. Through experiments we show that our method achieves high performance in terms of both cluster quality and speed.
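As a rough illustration of the component-equality idea mentioned above, the sketch below tests two Gaussian mixture components for equality using a Euclidean-distance criterion on their means and merges them by moment matching. The threshold, function names, and the moment-matching merge rule are illustrative assumptions, not the paper's exact procedure, and the hypothesis-test criterion is omitted for brevity.

```python
import numpy as np

def components_equal(mu1, mu2, dist_thresh=0.5):
    # Euclidean-distance criterion on component means (one of the two
    # equality criteria the abstract mentions; the threshold is an
    # assumed illustrative value, not taken from the paper).
    return np.linalg.norm(np.asarray(mu1) - np.asarray(mu2)) < dist_thresh

def merge_components(mu1, cov1, w1, mu2, cov2, w2):
    # Moment-matching merge of two Gaussian components: the merged
    # component preserves the total weight, mean, and covariance of the
    # pair. This is a standard technique; the paper's exact merge rule
    # may differ.
    w = w1 + w2
    a1, a2 = w1 / w, w2 / w
    mu = a1 * mu1 + a2 * mu2
    d1, d2 = mu1 - mu, mu2 - mu
    cov = (a1 * (cov1 + np.outer(d1, d1))
           + a2 * (cov2 + np.outer(d2, d2)))
    return mu, cov, w
```

In an incremental setting, each new mini-batch of data produces candidate components; candidates found equal to an existing component are merged into it, which keeps the model order from growing without bound.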
