Abstract

Many applications require the use of divergence measures between probability distributions. Several of these, such as the Kullback-Leibler (KL) divergence and the Bhattacharyya divergence, are tractable for single Gaussians, but intractable for complex distributions such as Gaussian mixture models (GMMs) used in speech recognizers. For tasks related to classification error, the Bhattacharyya divergence is of special importance. Here we derive efficient approximations to the Bhattacharyya divergence for GMMs, using novel variational methods and importance sampling. We introduce a combination of the two, variational importance sampling (VISa), which performs importance sampling using a proposal distribution derived from the variational approximation. VISa achieves the same accuracy as naive importance sampling at a fraction of the computation. Finally, we apply the Bhattacharyya divergence to compute word confusability and compare it with the corresponding estimates obtained using the KL divergence.

Index Terms: Variational importance sampling, Bhattacharyya divergence, variational methods, Gaussian mixture models.
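As a rough illustration of the kind of Monte Carlo estimator the abstract refers to, the sketch below estimates the Bhattacharyya divergence D_B(f, g) = -log ∫ sqrt(f(x) g(x)) dx between two one-dimensional GMMs by importance sampling. The equal-weight mixture proposal h = (f + g)/2 and the toy parameters are illustrative assumptions only; they are not the paper's VISa construction, which instead derives the proposal from the variational approximation.

```python
# Minimal sketch (not the paper's implementation): importance-sampling estimate of the
# Bhattacharyya divergence between two 1-D Gaussian mixture models.
import numpy as np

rng = np.random.default_rng(0)

def gmm_pdf(x, weights, means, stds):
    """Evaluate a 1-D Gaussian mixture density at the points x."""
    x = np.atleast_1d(x)[:, None]
    comp = np.exp(-0.5 * ((x - means) / stds) ** 2) / (stds * np.sqrt(2 * np.pi))
    return comp @ weights

def gmm_sample(n, weights, means, stds):
    """Draw n samples from a 1-D Gaussian mixture."""
    idx = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[idx], stds[idx])

def bhattacharyya_is(f, g, n=100_000):
    """Importance-sampling estimate of D_B(f, g) with the proposal h = (f + g) / 2."""
    wf, mf, sf = f
    wg, mg, sg = g
    # Sampling half the points from f and half from g draws from the equal-weight mixture h.
    x = np.concatenate([gmm_sample(n // 2, wf, mf, sf),
                        gmm_sample(n - n // 2, wg, mg, sg)])
    pf, pg = gmm_pdf(x, wf, mf, sf), gmm_pdf(x, wg, mg, sg)
    h = 0.5 * (pf + pg)
    bc = np.mean(np.sqrt(pf * pg) / h)  # Bhattacharyya coefficient estimate
    return -np.log(bc)

# Two toy GMMs given as (weights, means, standard deviations); values are arbitrary.
f = (np.array([0.6, 0.4]), np.array([-1.0, 2.0]), np.array([0.5, 1.0]))
g = (np.array([0.5, 0.5]), np.array([0.0, 2.5]), np.array([0.7, 0.8]))
print(f"Estimated Bhattacharyya divergence: {bhattacharyya_is(f, g):.4f}")
```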
