Abstract

Covariance matrices have attracted increasing attention for data representation in many computer vision tasks. Nonsingular covariance matrices are regarded as points on a Riemannian manifold rather than in Euclidean space. A common technique for classification on Riemannian manifolds is to embed the covariance matrices into a reproducing kernel Hilbert space (RKHS) and then construct a map from the RKHS to Euclidean space; however, in most kernel-based methods this explicit map rests on a linear hypothesis. In this paper, we propose a subspace learning framework that projects Riemannian manifolds to Euclidean space, and we give its theoretical derivation; specifically, the Euclidean space is isomorphic to a subspace of the RKHS. Under this framework, we first define an improved Log-Euclidean Gaussian radial basis function (RBF) kernel for the embedding, incorporating first-order statistical features of the input images into the kernel function to increase its discriminative power. We then seek the optimal projection matrix of the RKHS subspace by conducting a graph embedding discriminant analysis. Texture recognition and object categorization experiments with region covariance descriptors demonstrate the effectiveness of the improved Log-Euclidean Gaussian RBF kernel and the proposed method.
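The abstract names two concrete components: a Log-Euclidean Gaussian RBF kernel on symmetric positive-definite (SPD) matrices, and a graph embedding discriminant analysis over the resulting Gram matrix that yields a Euclidean projection. The sketch below illustrates both in NumPy/SciPy under stated assumptions; it is not the paper's implementation. The kernel shown is the standard Log-Euclidean Gaussian RBF, exp(-||log X - log Y||_F^2 / (2σ²)); the paper's improved variant additionally folds first-order (mean) statistics into the kernel in a form the abstract does not specify, so that part is omitted. The discriminant step is a generic kernelized graph embedding with hypothetical within-class/between-class affinity graphs, chosen only to show the structure of the generalized eigenproblem.

```python
import numpy as np
from scipy.linalg import eigh  # generalized symmetric eigensolver

def spd_log(M):
    """Matrix logarithm of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(M)
    return (V * np.log(w)) @ V.T  # V diag(log w) V^T

def log_euclidean_rbf(X, Y, sigma=1.0):
    """Log-Euclidean Gaussian RBF kernel between two SPD matrices."""
    d = np.linalg.norm(spd_log(X) - spd_log(Y), "fro")
    return np.exp(-(d ** 2) / (2.0 * sigma ** 2))

def gram(covs, sigma=1.0):
    """Gram matrix over a list of SPD (region covariance) descriptors."""
    n = len(covs)
    K = np.empty((n, n))
    for i in range(n):
        for j in range(i, n):
            K[i, j] = K[j, i] = log_euclidean_rbf(covs[i], covs[j], sigma)
    return K

def kernel_graph_embedding(K, labels, dim=2, reg=1e-6):
    """Generic kernelized graph-embedding discriminant analysis (sketch).

    W connects same-class samples (intrinsic graph), B connects
    different-class samples (penalty graph). Solves
    K L_B K a = lam K L_W K a for the RKHS expansion coefficients a,
    then maps each sample to its Euclidean coordinates K a.
    """
    y = np.asarray(labels)
    W = (y[:, None] == y[None, :]).astype(float)  # within-class affinity
    B = 1.0 - W                                   # between-class penalty
    L_W = np.diag(W.sum(1)) - W                   # graph Laplacians
    L_B = np.diag(B.sum(1)) - B
    A = K @ L_B @ K
    M = K @ L_W @ K + reg * np.eye(len(K))        # regularize for stability
    vals, vecs = eigh(A, M)                       # eigenvalues ascending
    alpha = vecs[:, ::-1][:, :dim]                # keep top eigenvectors
    return K @ alpha                              # Euclidean embedding

# Toy usage: random SPD "region covariance" descriptors from two classes.
rng = np.random.default_rng(0)
covs = [(lambda A: A @ A.T + 0.1 * np.eye(5))(rng.standard_normal((5, 5)))
        for _ in range(8)]
labels = [0, 0, 0, 0, 1, 1, 1, 1]
Z = kernel_graph_embedding(gram(covs, sigma=2.0), labels, dim=2)
print(Z.shape)  # (8, 2): each descriptor mapped to a 2-D Euclidean point
```

The matrix logarithm sends SPD matrices to the flat space of symmetric matrices, where the Frobenius norm gives the Log-Euclidean distance; the resulting Gaussian kernel is known to be positive definite, which is what licenses the RKHS embedding the abstract relies on.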
