Abstract

In this paper, we study the connections between Rényi entropy PCA, kernel learning, and graph embedding. A natural complementary formulation of maximum entropy PCA, namely minimum error entropy PCA, is presented. These two formulations can be combined to give a two-fold understanding of Rényi entropy PCA. Further, we establish connections between Rényi entropy PCA, kernel learning, and graph embedding, and propose a generalized graph embedding framework that unifies a variety of existing algorithms. The proposed framework essentially subsumes the previous graph embedding framework and partially answers the question of how to exploit high-order statistics of data in dimensionality reduction. This theoretical development establishes a close relationship between information-theoretic learning, kernel learning, and graph embedding.
