Abstract

We introduce a new log-likelihood (LL) based metric, called differential LL, for goodness-of-fit testing and for monitoring the unsupervised learning of mixture densities. We develop the metric in the case of a Gaussian kernel fitted to a Gaussian distribution. We suggest a possible differential LL learning strategy, show the formal link with the Kullback–Leibler divergence and with the quantization error, and introduce a Gaussian factorial distribution approximation by subspaces.
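To make the quantities concrete, the following self-contained Python sketch (our own illustration, not the paper's algorithm) fits a single Gaussian kernel q to samples from a Gaussian p and compares the empirical average LL with its closed-form expectation E_p[log q]. The gap E_p[log p] - E_p[log q] is exactly the Kullback–Leibler divergence KL(p||q), the formal link mentioned in the abstract; the one-dimensional setting and all names are our assumptions.

    # Illustrative sketch (assumed setup): empirical vs. expected
    # log-likelihood for a Gaussian kernel fitted to Gaussian data, 1-D.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    mu_p, sigma_p = 0.0, 1.0
    x = rng.normal(mu_p, sigma_p, size=10_000)      # samples from p

    # Fit a single Gaussian kernel q by maximum likelihood.
    mu_q, sigma_q = x.mean(), x.std()

    # Empirical average log-likelihood of the data under q.
    ll_emp = norm.logpdf(x, mu_q, sigma_q).mean()

    # Expected log-likelihood E_p[log q], closed form for Gaussians:
    # -0.5*log(2*pi*sigma_q^2) - (sigma_p^2 + (mu_p - mu_q)^2)/(2*sigma_q^2)
    ll_exp = (-0.5 * np.log(2 * np.pi * sigma_q**2)
              - (sigma_p**2 + (mu_p - mu_q)**2) / (2 * sigma_q**2))

    # KL(p||q) = E_p[log p] - E_p[log q], where E_p[log p] is minus the
    # differential entropy of p, i.e. -0.5*log(2*pi*e*sigma_p^2).
    entropy_p = 0.5 * np.log(2 * np.pi * np.e * sigma_p**2)
    kl = -entropy_p - ll_exp

    print(f"empirical LL: {ll_emp:.4f}  expected LL: {ll_exp:.4f}  "
          f"KL(p||q): {kl:.4f}")

Under a correct fit, the empirical LL concentrates around its expected value, so deviations between the two are the kind of signal one could track while monitoring unsupervised learning, in the spirit of the differential LL introduced above.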
