Abstract

This paper addresses the initialization of the expectation-maximization (EM) algorithm for maximum likelihood estimation of Gaussian mixture models. To avoid local maxima of the likelihood function, a genetic algorithm (GA) is proposed that searches for the best initial conditions of the EM algorithm. In the GA, a chromosome represents a set of initial conditions: the initial mean vectors of the mixture components are feature vectors chosen from the training set, and the chromosome also encodes the variances of the initial spherical covariance matrices of the components. To evaluate each chromosome, the EM algorithm is run until convergence and the resulting log-likelihood is used as the fitness. In computational experiments the approach was applied to a clustering problem and tested on two datasets from the image processing domain. The results indicate that the method outperforms the standard multiple-restart EM algorithm and is at least comparable to the state-of-the-art random swap EM method.
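The procedure described above — chromosomes encoding initial means chosen from the training set plus an initial spherical variance, with the converged EM log-likelihood as the fitness — can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes scikit-learn's `GaussianMixture` for the EM step and uses a simplified elitist GA with mutation only (the abstract does not specify the selection and crossover operators actually used).

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)


def em_fitness(X, mean_idx, var, n_components):
    """Run EM to convergence from the encoded initial conditions and
    return the final average log-likelihood (the chromosome's fitness)."""
    gm = GaussianMixture(
        n_components=n_components,
        covariance_type="spherical",
        weights_init=np.full(n_components, 1.0 / n_components),
        means_init=X[mean_idx],  # initial means = chosen training vectors
        precisions_init=np.full(n_components, 1.0 / var),  # spherical precision = 1/variance
        max_iter=200,
        random_state=0,
    )
    gm.fit(X)
    return gm.lower_bound_


def random_chromosome(n_samples, n_components):
    """A chromosome: indices of training vectors (initial means) + an initial variance."""
    idx = rng.choice(n_samples, size=n_components, replace=False)
    return idx, rng.uniform(0.1, 2.0)


def mutate(chrom, n_samples, n_components):
    """Replace one initial mean with another training vector; perturb the variance."""
    idx, var = chrom
    idx = idx.copy()
    idx[rng.integers(n_components)] = rng.integers(n_samples)
    return idx, var * rng.uniform(0.5, 1.5)


def ga_init_em(X, n_components, pop_size=8, generations=5):
    """Elitist GA over EM initial conditions; returns (best fitness, best chromosome)."""
    pop = [random_chromosome(len(X), n_components) for _ in range(pop_size)]
    best = None
    for _ in range(generations):
        scored = [(em_fitness(X, idx, var, n_components), (idx, var))
                  for idx, var in pop]
        scored.sort(key=lambda t: t[0], reverse=True)
        if best is None or scored[0][0] > best[0]:
            best = scored[0]
        elite = [c for _, c in scored[: pop_size // 2]]
        pop = elite + [mutate(elite[rng.integers(len(elite))], len(X), n_components)
                       for _ in range(pop_size - len(elite))]
    return best
```

Note that each fitness evaluation runs EM to convergence, so the GA's cost grows with population size and generation count; this is the price paid for searching over initial conditions rather than restarting EM blindly.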
