Abstract

Real-time learning requires on-line complexity estimation. Expectation-maximisation (EM) and sampling techniques are presented that enable simultaneous estimation of the complexity and continuous parameters of Gaussian mixture models (GMMs), which can be used for density estimation, classification and feature extraction. The solution is a maximum a posteriori probability (MAP) estimator that is convergent for fixed data and adaptive with accruing data. Issues resolved include estimating the priors for element covariances, means and weights, and calculating the local integrated likelihood (evidence) of the solution. The EM algorithm for MAP estimation of GMM parameters is established and extended to include complexity estimation (i.e. iterative pruning). The EMS algorithm, which incorporates a sampling stage enabling iterative growth of the GMM, is then introduced. Early trials on speech data indicate that the likelihood of hidden Markov speech models can be substantially increased using this approach.
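To make the setting concrete, the sketch below shows a plain maximum-likelihood EM loop for a GMM with a naive weight-threshold pruning step, loosely mirroring the iterative pruning idea described above. It is an illustrative sketch only, not the paper's method: the MAP priors on element weights, means and covariances, the evidence computation, and the EMS sampling (growth) stage are all omitted, and the names `em_gmm` and `prune_tol` are hypothetical.

```python
import numpy as np

def em_gmm(X, K, n_iter=100, prune_tol=1e-3, seed=0):
    """Minimal ML-EM for a Gaussian mixture with naive weight-based pruning.

    Illustrative assumption-laden sketch; the paper's MAP estimator,
    priors and evidence calculation are not implemented here.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Initialise means from random data points; start all covariances
    # at the (regularised) sample covariance and weights uniform.
    mu = X[rng.choice(n, K, replace=False)]
    cov = np.stack([np.cov(X.T) + 1e-6 * np.eye(d)] * K)
    w = np.full(K, 1.0 / K)

    for _ in range(n_iter):
        # E-step: responsibilities r[i, k] proportional to
        # w_k * N(x_i | mu_k, cov_k), computed in log space.
        log_r = np.empty((n, K))
        for k in range(K):
            diff = X - mu[k]
            L = np.linalg.cholesky(cov[k])
            z = np.linalg.solve(L, diff.T)          # whitened residuals
            log_det = 2.0 * np.log(np.diag(L)).sum()
            log_r[:, k] = (np.log(w[k])
                           - 0.5 * (d * np.log(2 * np.pi) + log_det
                                    + (z ** 2).sum(axis=0)))
        log_r -= log_r.max(axis=1, keepdims=True)   # stabilise exp
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)

        # M-step: re-estimate weights, means and covariances.
        Nk = r.sum(axis=0)
        w = Nk / n
        mu = (r.T @ X) / Nk[:, None]
        for k in range(K):
            diff = X - mu[k]
            cov[k] = (r[:, k, None] * diff).T @ diff / Nk[k]
            cov[k] += 1e-6 * np.eye(d)              # keep positive definite

        # Crude complexity control: drop elements whose weight vanishes.
        keep = w > prune_tol
        if not keep.all():
            w, mu, cov = w[keep] / w[keep].sum(), mu[keep], cov[keep]
            K = len(w)

    return w, mu, cov
```

Starting with a deliberately over-sized K and letting near-zero-weight elements be pruned is one simple way to let the data drive the mixture order downwards; the paper's MAP/evidence framework replaces this ad hoc threshold with a principled criterion, and its EMS sampling stage additionally allows the mixture to grow.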
