Abstract

The Gaussian mixture model has been used extensively in information processing and data analysis. However, its model selection, i.e., choosing the number of components (Gaussians) in the mixture, remains a difficult problem. Fortunately, the newly established Bayesian Ying–Yang (BYY) harmony function provides an efficient criterion for model selection of a Gaussian mixture on a set of sample data. In this paper, we propose a BYY scale-incremental EM algorithm for Gaussian mixture learning that uses a component split rule to increase the BYY harmony function incrementally. Specifically, the algorithm starts from two components and adds one component at a time via the split rule after each EM procedure, until a maximum number of components is reached; it thus increases the scale of the mixture incrementally and drives the BYY harmony function toward its maximum, yielding correct model selection together with good parameter estimation of the Gaussian mixture. Simulation experiments demonstrate that this BYY scale-incremental EM algorithm performs both model selection and parameter estimation efficiently for Gaussian mixture modeling. Moreover, the algorithm is successfully applied to two real-world problems: Iris data classification and unsupervised color image segmentation.
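The grow-and-select loop described above can be illustrated with a short sketch. The snippet below is a minimal illustration under stated assumptions, not the paper's implementation: it uses scikit-learn's GaussianMixture as the EM engine, a naive largest-weight split heuristic in place of the paper's split rule, and one common form of the BYY harmony function; byy_harmony, split_means, and k_max are hypothetical names introduced here for illustration.

```python
# Illustrative sketch of a scale-incremental EM loop for Gaussian mixture selection.
# The split heuristic and the harmony form are assumptions, not the paper's exact rule.
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture

def byy_harmony(gmm, X):
    """One common BYY harmony form:
    H = (1/N) * sum_n sum_k p(k|x_n) * [ln(alpha_k) + ln N(x_n | mu_k, Sigma_k)]."""
    resp = gmm.predict_proba(X)  # posteriors p(k|x_n), shape (N, K)
    log_dens = np.column_stack([
        multivariate_normal.logpdf(X, gmm.means_[k], gmm.covariances_[k])
        for k in range(gmm.n_components)
    ])
    return float(np.mean(np.sum(resp * (np.log(gmm.weights_) + log_dens), axis=1)))

def split_means(gmm):
    """Naive split: duplicate the heaviest component's mean, nudging the two copies
    apart along its leading covariance eigenvector (a stand-in for the split rule)."""
    j = np.argmax(gmm.weights_)
    vals, vecs = np.linalg.eigh(gmm.covariances_[j])
    shift = np.sqrt(vals[-1]) * vecs[:, -1]
    means = np.vstack([gmm.means_, gmm.means_[j] + shift])
    means[j] = means[j] - shift
    return means

def scale_incremental_em(X, k_max=10):
    """Grow the mixture from 2 to k_max components, running EM at each scale and
    keeping the fitted model that maximizes the harmony criterion."""
    best, best_h = None, -np.inf
    gmm = GaussianMixture(n_components=2, covariance_type="full").fit(X)
    for k in range(2, k_max + 1):
        h = byy_harmony(gmm, X)
        if h > best_h:
            best, best_h = gmm, h
        if k == k_max:
            break
        # Warm-start the next EM run from the current means plus one split component.
        gmm = GaussianMixture(n_components=k + 1, covariance_type="full",
                              means_init=split_means(gmm)).fit(X)
    return best, best_h
```

Used as, e.g., `model, h = scale_incremental_em(X, k_max=8)`, the loop returns the mixture size at which the harmony criterion peaks, mirroring the model-selection behavior the abstract attributes to the BYY scale-incremental EM algorithm.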
