Abstract

Image recognition tasks involve increasingly large amounts of symmetric positive definite (SPD) matrix data. SPD manifolds exhibit nonlinear geometry, so Euclidean machine learning methods cannot be applied to them directly. Kernel methods on SPD manifolds are based on projecting the data into a reproducing kernel Hilbert space. Unfortunately, existing kernel methods do not consider the connection between SPD matrices and their linear projections. This paper therefore proposes a framework that models the kernel map using the correlation between SPD matrices and their projections. To realize this, the paper formulates a Hilbert–Schmidt independence criterion (HSIC) regularization framework based on the kernel trick, where HSIC is a standard measure of the statistical dependence between two datasets. The proposed framework extends existing kernel methods to new HSIC-regularized kernel methods. Building on this framework, the paper also proposes an algorithm called HSIC regularized graph discriminant analysis (HRGDA) for SPD manifolds. Experimental results on several classification tasks demonstrate the accuracy and validity of both the HSIC regularization framework and HRGDA.
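Since the framework hinges on HSIC as a dependence measure between two sample sets, a brief illustration may help. Below is a minimal sketch of the biased empirical HSIC estimator, tr(KHLH)/(n-1)^2, computed between inputs and their low-dimensional projections; the RBF kernel choice and all function names are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of the biased empirical HSIC estimator (Gretton et al. style).
# The RBF kernel and all names here are illustrative assumptions.
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Gram matrix of a Gaussian RBF kernel over the rows of X."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def hsic(K, L):
    """Biased empirical HSIC: tr(K H L H) / (n-1)^2,
    where H = I - (1/n) 11^T centers the Gram matrices."""
    n = K.shape[0]
    H = np.eye(n) - np.full((n, n), 1.0 / n)
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Example: dependence between inputs X and a linear projection Y of them.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))
Y = X @ rng.normal(size=(10, 3))  # dependent projection -> larger HSIC
print(hsic(rbf_kernel(X), rbf_kernel(Y)))
```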

Highlights

  • Vision recognition tasks are often encountered in real-life applications [1,2,3]

  • Symmetric positive definite (SPD) matrices [6] have received increasing attention in the form of region covariance descriptors [7,8,9], Gaussian mixture models (GMM) [10], diffusion tensors [11, 12], and structure tensors [13, 14]. These descriptors use second-order statistical information to capture the correlation between different features and are effective in various applications [15,16,17,18,19] (see the covariance-descriptor sketch after this list). SPD matrices lie on an SPD manifold when endowed with an appropriate Riemannian metric

  • $W$ is the connection matrix defined in Eq. (7) as $W_{ij} = 1/q_c$ if $l_i = l_j$ (i.e., samples $i$ and $j$ share class $c$, with $q_c$ the size of that class), and $W_{ij} = 0$ otherwise; a sketch of this construction follows below. Both Riemannian locality preserving projections (RLPP) and covariance discriminative learning (CDL) are built on the traditional kernel framework, which assumes a linear map from the input space to the projective space. Thus, this paper proposes the Hilbert–Schmidt independence criterion (HSIC) regularization kernel framework by introducing the statistical correlation between SPD matrices and their low-dimensional representations
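As referenced in the second highlight above, a region covariance descriptor summarizes an image region by the covariance of per-pixel feature vectors, yielding an SPD matrix once regularized. The sketch below is a minimal illustration; the intensity-plus-gradient feature set and the ridge term are assumptions for demonstration, not the descriptors used in the paper.

```python
# Minimal sketch of a region covariance descriptor: each pixel yields a
# feature vector (intensity and gradient magnitudes -- an assumed feature
# set), and the region is summarized by their covariance matrix.
import numpy as np

def region_covariance(image, eps=1e-6):
    """Covariance descriptor of a grayscale region (H x W array)."""
    gy, gx = np.gradient(image.astype(float))
    # Per-pixel features: intensity, |dI/dx|, |dI/dy|.
    F = np.stack([image.ravel(), np.abs(gx).ravel(), np.abs(gy).ravel()],
                 axis=1)
    C = np.cov(F, rowvar=False)
    # Small ridge keeps the descriptor strictly positive definite.
    return C + eps * np.eye(C.shape[0])

region = np.random.default_rng(1).random((32, 32))
print(region_covariance(region).shape)  # (3, 3) SPD matrix
```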

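The class-based weight matrix reconstructed from Eq. (7) in the third highlight can be built directly from the label vector. A minimal sketch, assuming $q_c$ denotes the number of samples in class $c$ (variable names are illustrative):

```python
# Minimal sketch of the graph weight matrix from Eq. (7):
# W_ij = 1/q_c when samples i and j share class c, 0 otherwise.
import numpy as np

def class_graph_weights(labels):
    labels = np.asarray(labels)
    n = labels.shape[0]
    W = np.zeros((n, n))
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        W[np.ix_(idx, idx)] = 1.0 / idx.size  # 1/q_c within class c
    return W

print(class_graph_weights([0, 0, 1, 1, 1]))
```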

Summary

Introduction

Vision recognition tasks are often encountered in real-life applications [1,2,3]. Most traditional image recognition algorithms are constructed in Euclidean space [4, 5]. Most existing classification methods on SPD manifolds employ Riemannian metrics or matrix divergences as the dissimilarity measure [22,23,24,25,26], e.g., the log-Euclidean Riemannian metric (LERM) [21] and the affine-invariant Riemannian metric (AIRM) [20]; although the Jensen–Bregman LogDet divergence (JBLD) [27, 28] is not a true Riemannian metric, it enables fast, approximate distance computation (see the sketch below). As its third contribution, the paper proposes a method called HSIC regularized graph discriminant analysis (HRGDA) on SPD manifolds, built on the HSIC regularization kernel framework. HRGDA uses a LogDet divergence kernel for embedding and a variant of kernel linear discriminant analysis (LDA) for learning.
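The two measures named above are simple to compute. The following sketch implements LERM and JBLD for a pair of SPD matrices, plus a Gaussian-type kernel derived from JBLD; the kernel form and parameter names are illustrative assumptions, since the paper's exact kernel construction is not reproduced here.

```python
# Minimal sketch of LERM and JBLD, plus an assumed Gaussian-type JBLD kernel.
import numpy as np
from scipy.linalg import logm

def lerm_distance(X, Y):
    """LERM: Frobenius distance between matrix logs of SPD matrices."""
    # logm of an SPD matrix is real; discard any numerical imaginary residue.
    diff = np.real(logm(X) - logm(Y))
    return np.linalg.norm(diff, ord="fro")

def jbld(X, Y):
    """JBLD: log det((X+Y)/2) - 0.5 * log det(X Y)."""
    _, logdet_mid = np.linalg.slogdet((X + Y) / 2.0)
    _, logdet_x = np.linalg.slogdet(X)
    _, logdet_y = np.linalg.slogdet(Y)
    return logdet_mid - 0.5 * (logdet_x + logdet_y)

def jbld_kernel(X, Y, sigma=1.0):
    """Gaussian-type kernel built from JBLD (illustrative form)."""
    return np.exp(-jbld(X, Y) / sigma)

A = np.array([[2.0, 0.5], [0.5, 1.0]])
B = np.array([[1.0, 0.2], [0.2, 3.0]])
print(lerm_distance(A, B), jbld(A, B), jbld_kernel(A, B))
```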

Related Work
Preliminaries
HSIC Regularization
Methods
Conclusions