Abstract

This paper describes the Evolutionary Create & Eliminate for Expectation Maximization (ECE-EM) algorithm for learning finite Gaussian Mixture Models (GMMs). The proposed algorithm is a variant of the recently proposed Evolutionary Split & Merge for Expectation Maximization (ESM-EM) algorithm. ECE-EM uses simpler guiding functions and mutation operators than ESM-EM while retaining the appealing properties of its counterpart. As an additional contribution, we compare both ECE-EM and ESM-EM, on eighteen datasets, with two state-of-the-art algorithms that learn both the structure and the parameters of GMMs. Our experimental results suggest that both evolutionary algorithms offer a sound trade-off between computational time and accuracy relative to the other algorithms. Furthermore, ECE-EM obtained results at least as good as those achieved by ESM-EM.
