Abstract

Canonical correlation analysis (CCA) is a popular method that has been widely used in information fusion. However, CCA requires the data from the two views to be paired, which is hard to satisfy in real applications; moreover, it considers only the correlated information of the paired data. It therefore cannot be used when little or no paired data is available. In this paper, we propose a novel method named Canonical Principal Angles Correlation Analysis (CPACA), which does not need paired data during the training stage and thus frees classic CCA from the limitation of paired information. Its objective function is constructed as follows: first, the correlation between the two views is represented by the similarity between the two subspaces spanned by their principal components, which allows CPACA to compare favorably with CCA when paired data are limited; second, to increase the discriminative information of CPACA, we use manifold regularization to exploit the geometry of the marginal distribution. To optimize the objective function, we propose a new method to compute the projection vectors. Experimental results show that the performance of CPACA is superior to that of traditional CCA and its variants.
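As background for the first ingredient of the objective, the sketch below illustrates how subspace similarity can be measured through principal (canonical) angles: the top-k principal directions of each view are extracted by PCA, and the singular values of the product of the two orthonormal bases give the cosines of the principal angles. This is a minimal illustration that assumes both views share the same feature dimension and uses plain PCA; it is not the authors' exact CPACA objective or optimization procedure.

```python
import numpy as np

def principal_angle_cosines(X1, X2, k):
    """Cosines of the principal angles between the subspaces spanned by
    the top-k principal components of two (unpaired) views.

    X1: (n1, d) samples of view 1; X2: (n2, d) samples of view 2.
    The rows of X1 and X2 need not be paired, and n1 may differ from n2.
    """
    # Orthonormal bases of the top-k principal directions of each view.
    U1 = np.linalg.svd(X1 - X1.mean(axis=0), full_matrices=False)[2][:k].T
    U2 = np.linalg.svd(X2 - X2.mean(axis=0), full_matrices=False)[2][:k].T

    # Singular values of U1^T U2 are the cosines of the principal
    # (canonical) angles between the two k-dimensional subspaces;
    # their sum or mean can serve as a subspace-similarity score.
    return np.linalg.svd(U1.T @ U2, compute_uv=False)

# Example with random, unpaired data (hypothetical inputs for illustration).
rng = np.random.default_rng(0)
X1 = rng.standard_normal((200, 30))
X2 = rng.standard_normal((150, 30))
print(principal_angle_cosines(X1, X2, k=5))
```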
