Abstract

This work reviews and extends a family of log-determinant (log-det) divergences for symmetric positive definite (SPD) matrices and discusses their fundamental properties. We show how to use the parameterized Alpha-Beta (AB) and Gamma log-det divergences to generate many well-known divergences; in particular, we consider Stein’s loss, the S-divergence, also called the Jensen-Bregman LogDet (JBLD) divergence, the Logdet Zero (Bhattacharyya) divergence, the Affine Invariant Riemannian Metric (AIRM), and other divergences. Moreover, we establish links and correspondences between log-det divergences and visualise them on an alpha-beta plane for various sets of parameters. We use this unifying framework to interpret and extend existing similarity measures for semidefinite covariance matrices in finite-dimensional Reproducing Kernel Hilbert Spaces (RKHS). This paper also shows how the Alpha-Beta family of log-det divergences relates to the divergences of multivariate and multilinear normal distributions. Closed-form formulas are derived for the Gamma divergences of two multivariate Gaussian densities; the special cases of the Kullback-Leibler, Bhattacharyya, Rényi, and Cauchy-Schwarz divergences are discussed. Symmetrized versions of the log-det divergences are also considered and briefly reviewed. Finally, a class of divergences is extended to multiway divergences for separable covariance (or precision) matrices.
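To make the unifying role of the AB family concrete, the short NumPy/SciPy sketch below evaluates an AB log-det divergence of the form D_AB^{(α,β)}(P‖Q) = (1/(αβ)) log det[(α(PQ⁻¹)^β + β(PQ⁻¹)^{−α})/(α+β)], defined for α, β, α+β ≠ 0, and checks numerically that particular parameter choices recover the S-divergence (JBLD), Stein’s loss, and half of the squared AIRM. This is only an illustrative sketch under that commonly used parameterization (scaling conventions may differ from the paper); the singular limits are approached with small numerical parameter values rather than the exact limiting formulas, and the function names and test matrices are invented for the example.

```python
import numpy as np
from scipy.linalg import eigh

def ab_logdet_div(P, Q, alpha, beta):
    """AB log-det divergence D_AB^{(alpha,beta)}(P || Q) for SPD matrices,
    with alpha, beta and alpha + beta assumed nonzero.  Because P Q^{-1} is
    similar to the SPD matrix Q^{-1/2} P Q^{-1/2}, the log-det reduces to a
    sum over the generalized eigenvalues lambda_i of the pencil (P, Q)."""
    lam = eigh(P, Q, eigvals_only=True)            # eigenvalues of Q^{-1} P, all > 0
    return np.sum(np.log((alpha * lam**beta + beta * lam**(-alpha)) / (alpha + beta))) / (alpha * beta)

def stein_loss(P, Q):
    """Stein's loss: tr(P Q^{-1}) - log det(P Q^{-1}) - n."""
    M = P @ np.linalg.inv(Q)
    return np.trace(M) - np.linalg.slogdet(M)[1] - P.shape[0]

def s_divergence(P, Q):
    """S-divergence (JBLD): log det((P + Q)/2) - (1/2) log det(P Q)."""
    return (np.linalg.slogdet((P + Q) / 2)[1]
            - 0.5 * (np.linalg.slogdet(P)[1] + np.linalg.slogdet(Q)[1]))

rng = np.random.default_rng(0)
A, B = rng.standard_normal((5, 5)), rng.standard_normal((5, 5))
P, Q = A @ A.T + 5 * np.eye(5), B @ B.T + 5 * np.eye(5)   # two random SPD test matrices

lam = eigh(P, Q, eigvals_only=True)
print(np.isclose(ab_logdet_div(P, Q, 0.5, 0.5), 4 * s_divergence(P, Q)))          # alpha = beta = 1/2
print(np.isclose(ab_logdet_div(P, Q, 1e-8, 1.0), stein_loss(P, Q)))               # limit alpha -> 0, beta = 1
print(np.isclose(ab_logdet_div(P, Q, 1e-4, 1e-4), 0.5 * np.sum(np.log(lam)**2)))  # alpha = beta -> 0 (half squared AIRM)
```

All three checks should print True to within the tolerance of np.isclose; the divergence depends on P and Q only through the generalized eigenvalues of the pencil (P, Q), and the small parameter values stand in for the exact limiting formulas of the AB family.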

Highlights

  • There is a close connection between divergences and the notions of entropy, information geometry, and statistical mean [2,4,5,6,7], while matrix divergences are closely related to the invariant geometrical properties of the manifold of probability distributions [4,8,9,10].

  • Several forms of the log-det divergence exist in the literature, including the log-determinant α-divergence, the Riemannian metric, Stein’s loss, the S-divergence, also called the Jensen-Bregman LogDet (JBLD) divergence, and the symmetrized Kullback-Leibler Density Metric (KLDM), or Jeffrey’s KL divergence [5,6,14,17,18,19,20].

  • We present novel similarity measures; in particular, we consider the Alpha-Beta and Gamma log-det divergences, which smoothly connect or unify a wide class of existing divergences for symmetric positive definite (SPD) matrices.

Summary

Introduction

Divergences or (dis)similarity measures between symmetric positive definite (SPD) matrices underpin many applications, including Diffusion Tensor Imaging (DTI) segmentation, classification, clustering, pattern recognition, model selection, statistical inference, and data processing problems [1,2,3]. Several forms of the log-det divergence exist in the literature, including the log-determinant α-divergence, the Riemannian metric, Stein’s loss, the S-divergence, also called the Jensen-Bregman LogDet (JBLD) divergence, and the symmetrized Kullback-Leibler Density Metric (KLDM), or Jeffrey’s KL divergence [5,6,14,17,18,19,20]. Despite their numerous applications, the common theoretical properties of these divergences and the relationships between them have not been fully established. The divergences discussed in this paper are robust with respect to outliers and noise, provided that the tuning parameters α, β, and γ are chosen properly.
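As a concrete instance of how matrix log-det divergences arise from divergences between densities, recall the standard closed form of the Kullback-Leibler divergence between two n-variate Gaussian densities (a well-known identity quoted here for orientation; the scaling conventions used for the KLDM in the cited works may differ):

$$
D_{\mathrm{KL}}\!\big(\mathcal{N}(\boldsymbol{\mu}_0,\boldsymbol{\Sigma}_0)\,\|\,\mathcal{N}(\boldsymbol{\mu}_1,\boldsymbol{\Sigma}_1)\big)
= \tfrac{1}{2}\Big[\operatorname{tr}\big(\boldsymbol{\Sigma}_1^{-1}\boldsymbol{\Sigma}_0\big)
+ (\boldsymbol{\mu}_1-\boldsymbol{\mu}_0)^{\top}\boldsymbol{\Sigma}_1^{-1}(\boldsymbol{\mu}_1-\boldsymbol{\mu}_0)
- n + \log\frac{\det\boldsymbol{\Sigma}_1}{\det\boldsymbol{\Sigma}_0}\Big].
$$

For equal means this reduces to half of Stein’s loss between the covariance matrices, $\tfrac{1}{2}\big[\operatorname{tr}(\boldsymbol{\Sigma}_1^{-1}\boldsymbol{\Sigma}_0) - \log\det(\boldsymbol{\Sigma}_1^{-1}\boldsymbol{\Sigma}_0) - n\big]$, and symmetrizing it gives the Jeffreys-type divergence $\tfrac{1}{2}\operatorname{tr}(\boldsymbol{\Sigma}_1^{-1}\boldsymbol{\Sigma}_0 + \boldsymbol{\Sigma}_0^{-1}\boldsymbol{\Sigma}_1) - n$, which is the kind of symmetrized KL measure referred to above.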

Preliminaries
Basic Alpha-Beta Log-Determinant Divergence
Special Cases of the AB Log-Det Divergence
Properties of the AB Log-Det Divergence
Relative invariance for scale transformation
Symmetrized AB Log-Det Divergences
Measuring the Dissimilarity with a Divergence Lower-Bound
Similarity Measures Between Regularized Covariance Descriptors
Symmetry
The AB Log-Det Divergence for Noisy and Ill-Conditioned Covariance Matrices
Conclusions
Basic operations for positive definite matrices
Proof of the Properties of the AB Log-Det Divergence
Proof of Theorem 3
Gamma Divergence for Multivariate Gaussian Densities