Abstract

A new hybrid evolutionary algorithm (EA) for Gaussian mixture model-based clustering is proposed. The EA is a steady-state method that, in each generation, selects two individuals from a population, creates two offspring using either mutation or crossover, and fine-tunes the offspring with the expectation-maximization (EM) algorithm. The offspring then compete with their parents for survival into the next generation. The mutation operator is a random swap that replaces a component mean with a randomly chosen feature vector. The crossover operator copies a random component from a source mixture into a destination mixture, favoring source components located far from the components of the destination mixture. In computational experiments, the approach was compared with a multiple-restarts EM algorithm, a random swap EM method, and a state-of-the-art hybrid evolutionary algorithm for Gaussian mixture model learning on one real and 29 synthetic datasets. The results indicate that, given the same computational budget, the proposed method usually learns mixtures with higher log-likelihoods than the competing algorithms, and the partitions it produces correspond best to the original class divisions of the datasets.
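The two variation operators described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: only the component means are manipulated (a real implementation would also carry covariances and mixing weights), the function names are hypothetical, and the distance-weighted selection of the source component is one plausible reading of "favors components located away from the destination mixture".

```python
import numpy as np

def random_swap_mutation(means, X, rng):
    """Random-swap mutation: replace one randomly chosen component
    mean with a randomly chosen feature vector from the dataset X."""
    means = means.copy()
    k = rng.integers(len(means))          # component to replace
    means[k] = X[rng.integers(len(X))]    # random data point becomes the new mean
    return means

def crossover(src_means, dst_means, rng):
    """Crossover: copy one component mean from the source mixture into
    the destination mixture. Source components far from every destination
    component are chosen with higher probability (assumed scheme)."""
    # Distance from each source component to its nearest destination component.
    d = np.min(np.linalg.norm(src_means[:, None] - dst_means[None, :], axis=2), axis=1)
    p = d / d.sum() if d.sum() > 0 else np.full(len(d), 1.0 / len(d))
    i = rng.choice(len(src_means), p=p)   # distance-weighted source pick
    j = rng.integers(len(dst_means))      # destination slot (uniform here)
    out = dst_means.copy()
    out[j] = src_means[i]
    return out
```

In the full algorithm, each offspring produced by one of these operators would then be fine-tuned by a few EM iterations before competing with its parents for survival.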
