Abstract
Real-time learning requires on-line complexity estimation. Expectation-maximisation (EM) and sampling techniques are presented that enable simultaneous estimation of the complexity and continuous parameters of Gaussian mixture models (GMMs), which can be used for density estimation, classification and feature extraction. The solution is a maximum a posteriori (MAP) estimator that is convergent for fixed data and adaptive as data accrue. Issues resolved include estimating the priors on element covariances, means and weights, and calculating the local integrated likelihood (evidence) of the solution. The EM algorithm for MAP estimation of GMM parameters is established and extended to include complexity estimation (i.e. iterative pruning). The EMS algorithm is introduced, which incorporates a sampling stage that enables iterative growth of the GMM. Early trials on speech data indicate that the likelihood of hidden Markov speech models can be very substantially increased using this approach.
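For context, the EM updates underlying this family of methods can be sketched as follows. This is a minimal maximum-likelihood EM loop for a one-dimensional GMM; the paper's MAP estimator would additionally place priors on the weights, means and covariances, and the pruning (complexity estimation) and EMS sampling stages described in the abstract are not shown. Function and variable names here are illustrative, not from the paper.

```python
import numpy as np

def em_gmm(x, k, n_iter=100, seed=0):
    """Fit a k-component 1-D Gaussian mixture to data x by plain EM
    (maximum likelihood; a MAP variant would add prior terms to the
    M-step and a pruning/sampling stage would adapt k itself)."""
    rng = np.random.default_rng(seed)
    n = len(x)
    # Initialise mixture weights, means and variances.
    w = np.full(k, 1.0 / k)
    mu = rng.choice(x, size=k, replace=False)
    var = np.full(k, np.var(x))
    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] = P(component j | x_i).
        dens = (w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var)
                / np.sqrt(2.0 * np.pi * var))
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means and variances from the
        # expected component memberships.
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var
```

On well-separated data the recovered means converge to the cluster centres; the MAP extension in the paper regularises exactly these updates so that components can be pruned or grown on-line.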