Abstract

This work presents a novel parametrized family of Log-Determinant (Log-Det) divergences between positive definite unitized trace class operators on a Hilbert space, generalizing the Log-Det divergences between symmetric positive definite matrices to the infinite-dimensional setting. For the Log-Det divergences between covariance operators on a Reproducing Kernel Hilbert Space (RKHS), we obtain closed-form expressions via the corresponding Gram matrices. By employing the Log-Det divergences, we then generalize the Bhattacharyya and Hellinger distances and the Kullback–Leibler and Rényi divergences between multivariate normal distributions to Gaussian measures on an infinite-dimensional Hilbert space.
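
For orientation only, the finite-dimensional family that the abstract refers to is often written in the Alpha Log-Det form of Chebbi and Moakher for symmetric positive definite matrices. The sketch below computes that matrix version with NumPy; the function name `logdet_alpha_divergence`, the particular α-parametrization, and the random test matrices are illustrative assumptions and are not taken from the paper's operator-valued definition.

```python
import numpy as np

def logdet_alpha_divergence(A, B, alpha=0.0):
    """Alpha Log-Det divergence between SPD matrices (Chebbi-Moakher form).

    Assumed finite-dimensional sketch: the paper's infinite-dimensional,
    operator-valued parametrization may differ from this one.
    """
    assert -1.0 < alpha < 1.0, "this closed form holds for alpha in (-1, 1)"
    # Convex combination (1-alpha)/2 * A + (1+alpha)/2 * B
    mix = 0.5 * (1.0 - alpha) * A + 0.5 * (1.0 + alpha) * B
    _, logdet_mix = np.linalg.slogdet(mix)
    _, logdet_A = np.linalg.slogdet(A)
    _, logdet_B = np.linalg.slogdet(B)
    return (4.0 / (1.0 - alpha**2)) * (
        logdet_mix
        - 0.5 * (1.0 - alpha) * logdet_A
        - 0.5 * (1.0 + alpha) * logdet_B
    )

# Example with two random SPD matrices (regularized for numerical stability).
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 5))
Y = rng.standard_normal((5, 5))
A = X @ X.T + 1e-3 * np.eye(5)
B = Y @ Y.T + 1e-3 * np.eye(5)
print(logdet_alpha_divergence(A, B, alpha=0.0))
```

At particular parameter values, divergences of this type recover familiar quantities between zero-mean Gaussian distributions (for instance, log-det expressions of Bhattacharyya type); the abstract's contribution is to extend such formulas to Gaussian measures and covariance operators on infinite-dimensional Hilbert spaces.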
