In this study, we employ the Gaussian Mixture Model (GMM) and propose a novel learning algorithm to approximate arbitrary densities quickly and simply. In our previous study, we introduced an idea called GMM expansion, inspired by Fourier expansion. Analogous to the frequency basis in Fourier expansion, GMM expansion assumes that normal distributions placed evenly along the support form a set of bases that can approximate a large class of distributions with good accuracy. In this work, a new algorithm is proposed based on the GMM expansion idea, and a theoretical analysis is given to verify its convergence. Various experiments are carried out to examine the efficacy of the proposed method. The results demonstrate its advantages and support that the new algorithm is faster, more accurate, more stable, and easier to use than the Expectation-Maximization (EM) algorithm. Furthermore, these benefits improve the integration of GMMs into neural networks. Experimental results show that a neural network equipped with our proposed method significantly improves the ability to handle inverse problems and data uncertainty. Finally, another application, a GMM-based neural network generator, is built; it shows the potential of random sampling from distributions for feature-variation control in generative models.
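To make the GMM expansion idea concrete, the following is a minimal sketch, not the paper's actual learning algorithm: it fixes K Gaussian bases evenly along the support with a shared bandwidth, and weights each basis by the fraction of samples nearest to its mean. The function names, the nearest-mean weighting rule, and the bandwidth choice are all illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

def gmm_expansion_fit(samples, K=30, sigma_scale=1.0):
    # Place K Gaussian means evenly along the empirical support.
    lo, hi = samples.min(), samples.max()
    means = np.linspace(lo, hi, K)
    # Shared bandwidth tied to the basis spacing (an assumption,
    # not the bandwidth rule from the paper).
    sigma = sigma_scale * (hi - lo) / K
    # Weight each basis by the share of samples closest to its mean.
    idx = np.abs(samples[:, None] - means[None, :]).argmin(axis=1)
    weights = np.bincount(idx, minlength=K) / len(samples)
    return means, sigma, weights

def gmm_expansion_pdf(x, means, sigma, weights):
    # Density estimate: weighted sum of the fixed Gaussian bases at x.
    return (weights * norm.pdf(x[:, None], means, sigma)).sum(axis=1)

# Usage: approximate a bimodal density from samples.
rng = np.random.default_rng(0)
samples = np.concatenate([rng.normal(-2, 0.5, 5000),
                          rng.normal(3, 1.0, 5000)])
means, sigma, weights = gmm_expansion_fit(samples)
xs = np.linspace(-5, 7, 200)
density = gmm_expansion_pdf(xs, means, sigma, weights)
```

Unlike EM, nothing here is estimated iteratively: the means and bandwidth are fixed in advance, so only the mixture weights need to be learned, which is the source of the speed and stability advantages the abstract claims for this family of methods.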