Abstract

We extend the EM algorithm to overcome its main bottleneck: trapping in local maxima of the marginal likelihood caused by its strong dependence on initial conditions. As an alternative to the posterior distribution appearing in the so-called Q function, we use the distribution that maximizes the non-extensive Tsallis entropy. This distribution carries a parameter q that represents the non-extensivity of the entropy, and we control q so as to weaken the influence of the initial conditions. To investigate the algorithm's performance, we apply it to Gaussian mixture estimation problems under additive noise. In the large-data limit, we analytically derive the averaged update equations for the hyperparameters, the marginal likelihood, and related quantities. Our analysis supports the usefulness of the algorithm.
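To make the idea concrete, below is a minimal sketch of such a q-deformed EM loop for a one-dimensional Gaussian mixture. It assumes (the paper's exact update rules may differ) that the E-step posterior is replaced by its escort-type deformation r_k ∝ p_k^q, and that q is annealed from an initial value q0 < 1 toward 1, so early iterations see a flattened posterior that is less sensitive to the starting point. The function name q_em_gmm and the linear annealing schedule are illustrative choices, not taken from the paper.

```python
import numpy as np

def gauss_pdf(x, mu, var):
    """Univariate Gaussian density, broadcast over components."""
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

def q_em_gmm(x, k=2, q0=0.5, n_iter=100, seed=0):
    """Sketch of an EM variant with a q-deformed E-step.
    Illustrative only: responsibilities are taken proportional to
    (pi_k * p(x|k))**q with q annealed q0 -> 1; the paper's exact
    Tsallis-maximum-entropy posterior may differ."""
    rng = np.random.default_rng(seed)
    pi = np.full(k, 1.0 / k)                    # mixing weights
    mu = rng.choice(x, size=k, replace=False)   # random initial means
    var = np.full(k, np.var(x))                 # initial variances
    for t in range(n_iter):
        # Anneal q toward 1; q < 1 flattens the posterior and
        # weakens the influence of the initial conditions.
        q = q0 + (1.0 - q0) * t / max(n_iter - 1, 1)
        # E-step: q-deformed responsibilities, shape (n, k).
        p = pi * gauss_pdf(x[:, None], mu, var)
        r = p ** q
        r /= r.sum(axis=1, keepdims=True)
        # M-step: ordinary weighted maximum-likelihood updates.
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

# Toy usage: two well-separated Gaussian components.
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-3, 1, 500), rng.normal(3, 1, 500)])
print(q_em_gmm(data))
```

At q = 1 the E-step reduces to the standard EM responsibilities, so the sketch interpolates between a smoothed, initialization-robust regime and ordinary maximum-likelihood EM.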
