Abstract
The Gaussian mixture model (GMM) is a widely used probabilistic clustering model, and incremental GMM learning forms the basis of a variety of more complex incremental learning algorithms. It is typically applied to real-time or massive-data problems where the standard Expectation-Maximization (EM) algorithm is impractical. However, the output of an incremental learning algorithm may exhibit lower cluster quality than that of standard EM. To obtain a high-quality and fast incremental GMM learning algorithm, we develop a method for incremental learning of GMMs on a GPU-CPU hybrid system. Our method uses the model's evolution history to approximate the model order, and combines a hypothesis test with a Euclidean-distance criterion to test mixture components for equality. Experiments show that our method achieves high performance in terms of both cluster quality and speed.
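The abstract mentions a Euclidean-distance criterion for deciding whether two mixture components are equal. As a rough illustration only (the function name, threshold, and the restriction to component means are our assumptions, not details from the paper), such a distance check might look like:

```python
import math

def means_close(mu_a, mu_b, thresh=0.5):
    # Hypothetical first-pass equality check between two Gaussian
    # components: if the Euclidean distance between their mean vectors
    # falls below `thresh`, they are candidates for being merged into
    # a single cluster. The paper additionally applies a hypothesis
    # test, which is not sketched here.
    d = math.sqrt(sum((a - b) ** 2 for a, b in zip(mu_a, mu_b)))
    return d < thresh
```

In practice such a cheap geometric check would be paired with a statistical hypothesis test on the component parameters, as the abstract describes.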