Abstract
It is well known that the EM algorithm generally converges to a local maximum likelihood estimate. However, considerable evidence shows that the EM algorithm can converge to the true parameters as long as the overlap among the Gaussians in the sample data is small enough. This paper studies this correct convergence problem asymptotically for the EM algorithm on Gaussian mixtures. It is proved that the EM algorithm becomes a contraction mapping of the parameters within a neighborhood of the consistent maximum likelihood solution when the measure of average overlap among the Gaussians in the original mixture is small enough and the number of samples is large enough. That is, if the initial parameters are set within this neighborhood, the EM algorithm will always converge to the consistent solution, i.e., the expected result. Moreover, simulation results further demonstrate that this correct convergence neighborhood becomes larger as the average overlap becomes smaller.
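As a concrete illustration of the claim (a minimal sketch, not the paper's actual experiment), the following Python snippet runs the standard EM iteration on a well-separated two-component univariate Gaussian mixture, with the iteration initialized within a neighborhood of the true parameters. All parameter values and the initialization below are hypothetical choices for illustration only; with small overlap and a large sample, the estimates settle close to the true values.

```python
# Sketch: EM on a well-separated (small-overlap) 1-D Gaussian mixture,
# initialized near the true parameters. Values are illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(0)

# True mixture with well-separated components (small overlap).
true_w = np.array([0.4, 0.6])
true_mu = np.array([-5.0, 5.0])
true_sigma = np.array([1.0, 1.0])

# Draw a large sample from the mixture.
n = 10_000
labels = rng.choice(2, size=n, p=true_w)
x = rng.normal(true_mu[labels], true_sigma[labels])

def em_step(x, w, mu, sigma):
    """One EM iteration for a univariate Gaussian mixture."""
    # E-step: posterior responsibility of each component for each sample.
    dens = (w / (sigma * np.sqrt(2 * np.pi))) * \
           np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2)
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate mixing weights, means, and standard deviations.
    nk = resp.sum(axis=0)
    w_new = nk / len(x)
    mu_new = (resp * x[:, None]).sum(axis=0) / nk
    sigma_new = np.sqrt((resp * (x[:, None] - mu_new) ** 2).sum(axis=0) / nk)
    return w_new, mu_new, sigma_new

# Initialize within a neighborhood of the true parameters (hypothetical choice).
w, mu, sigma = np.array([0.5, 0.5]), np.array([-4.0, 4.0]), np.array([2.0, 2.0])
for _ in range(100):
    w, mu, sigma = em_step(x, w, mu, sigma)

print("estimated weights:", np.round(w, 3))      # close to [0.4, 0.6]
print("estimated means:  ", np.round(mu, 3))     # close to [-5, 5]
print("estimated sigmas: ", np.round(sigma, 3))  # close to [1, 1]
```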