Abstract
This paper studies how to learn the parameters of diagonal Gaussian mixture models. The problem can be formulated as computing incomplete symmetric tensor decompositions. We use generating polynomials to compute incomplete symmetric tensor decompositions and approximations, and then apply the tensor approximation method to learn diagonal Gaussian mixture models. We also provide a stability analysis: when the first- and third-order moments are sufficiently accurate, we show that the obtained parameters for the Gaussian mixture model are also highly accurate. Numerical experiments are also provided.
Highlights
A Gaussian mixture model consists of several component Gaussian distributions
This paper gives a new algorithm for learning Gaussian mixture models with diagonal covariance matrices
We first give a method for computing incomplete symmetric tensor decompositions
Summary
A Gaussian mixture model consists of several component Gaussian distributions. Given samples of a Gaussian mixture model, one often needs to estimate the parameters of each component Gaussian distribution [24, 32]. Learning a Gaussian mixture model means estimating the parameters ωi, μi, Σi for each i ∈ [r] from given samples of y. Some generating polynomials can still be determined from the partially given tensor entries Fi1i2i3 with (i1, i2, i3) ∈ Ω; they can be used to obtain the incomplete tensor decomposition. The parameters of the Gaussian mixture model can then be recovered from the incomplete tensor decomposition of F. Hsu and Kakade [27] provided a learning algorithm for a mixture of spherical Gaussians, i.e., one where each covariance matrix is a multiple of the identity matrix. Their method is based on moments up to order three and assumes only non-degeneracy instead of separation conditions.
Contributions
This paper proposes a new method for learning diagonal Gaussian mixture models, based on sampled estimates of the first- and third-order moments.
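To make the moment-based setting concrete, the following is a minimal sketch (not the paper's algorithm) of sampling from a hypothetical diagonal Gaussian mixture and forming the empirical first-order moment E[y] and third-order moment tensor E[y ⊗ y ⊗ y] that such methods take as input. All parameter values here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical diagonal Gaussian mixture: r components in dimension d.
r, d, n = 3, 4, 200_000
weights = np.array([0.5, 0.3, 0.2])             # mixing weights ω_i (sum to 1)
means = rng.normal(size=(r, d))                 # component means μ_i
diag_vars = rng.uniform(0.5, 1.5, size=(r, d))  # diagonals of the covariances Σ_i

# Draw n samples y from the mixture: pick a component, then sample from it.
labels = rng.choice(r, size=n, p=weights)
samples = means[labels] + np.sqrt(diag_vars[labels]) * rng.normal(size=(n, d))

# Empirical first-order moment E[y] and third-order moment tensor E[y ⊗ y ⊗ y].
M1 = samples.mean(axis=0)
M3 = np.einsum('ni,nj,nk->ijk', samples, samples, samples) / n

# Sanity check: the exact first moment is the weighted average of the means.
M1_exact = weights @ means
print(np.max(np.abs(M1 - M1_exact)))  # small, up to sampling error
```

The tensor M3 is symmetric by construction; recovering the weights, means, and diagonal covariances from (partial entries of) such moment tensors is the incomplete symmetric tensor decomposition problem the paper addresses.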