Abstract

Canonical correlation analysis (CCA) is a powerful statistical tool for quantifying correlations between two sets of multidimensional variables. However, CCA cannot detect nonlinear relationships, and deriving canonical variates for high-dimensional data is costly. Kernel CCA, a nonlinear extension of CCA, can capture nonlinear relations and reduce high dimensionality efficiently. However, kernel CCA suffers from the so-called over-fitting phenomenon in the high-dimensional feature space. To address these shortcomings, this paper develops a novel robust kernel CCA algorithm (KCCA-ROB). The proposed method first reformulates the traditional generalized eigenvalue–eigenvector problem into a new framework. Within this framework, we develop a stable and fast algorithm based on the singular value decomposition (SVD). Experimental results on both a simulated dataset and real-world datasets demonstrate the effectiveness of the proposed method.
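
To make the general idea concrete, below is a minimal sketch of standard regularized kernel CCA solved via an SVD, using only NumPy. It is not the paper's KCCA-ROB algorithm: the RBF kernel, the ridge term `reg`, the bandwidth `gamma`, and the rank cutoff `tol` are illustrative assumptions, and the regularization convention shown here is just one common choice.

```python
# Sketch of regularized kernel CCA via SVD (illustrative only, not KCCA-ROB).
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) Gram matrix between rows of A and rows of B."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def center_kernel(K):
    """Double-center a Gram matrix (centering in feature space)."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def kernel_cca(X, Y, gamma=1.0, reg=1e-3, tol=1e-10):
    """Return canonical correlations of regularized kernel CCA via SVD."""
    n = X.shape[0]
    Kx = center_kernel(rbf_kernel(X, X, gamma))
    Ky = center_kernel(rbf_kernel(Y, Y, gamma))

    # Kernel PCA scores Phi = U * sqrt(lambda); keep numerically nonzero modes.
    def scores(K):
        lam, U = np.linalg.eigh(K)
        keep = lam > tol * lam.max()
        return U[:, keep] * np.sqrt(lam[keep]), lam[keep]

    Px, lx = scores(Kx)
    Py, ly = scores(Ky)

    # In these coordinates the regularized covariances are diagonal, so the
    # whitening factors C_xx^{-1/2} and C_yy^{-1/2} are simple scalings.
    dx = 1.0 / np.sqrt(lx / n + reg)
    dy = 1.0 / np.sqrt(ly / n + reg)
    M = (dx[:, None] * (Px.T @ Py / n)) * dy[None, :]

    # Singular values of the whitened cross-covariance = canonical correlations.
    return np.linalg.svd(M, compute_uv=False)

# Toy usage: two views sharing a nonlinear (quadratic) signal.
rng = np.random.default_rng(0)
t = rng.normal(size=(200, 1))
X = np.hstack([t, rng.normal(size=(200, 2))])
Y = np.hstack([t**2, rng.normal(size=(200, 2))])
print(kernel_cca(X, Y, gamma=0.5, reg=1e-2)[:3])
```

The ridge term `reg` plays the role of the regularization that kernel CCA needs to avoid the trivial over-fitted solution (canonical correlations of 1); without it, the whitened cross-covariance of full-rank kernels would be orthogonal.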
