Abstract

In high-dimensional regression and classification, feature selection and dimensionality reduction are often required to cope with the heavy computational cost and over-fitting of the parameters. Canonical Correlation Analysis (CCA) and its hierarchical extensions (including Bayesian methods) have been proposed for this purpose. However, real data sets often violate the linearity assumption of CCA, so a non-linear extension is needed. To address this problem, we propose a Bayesian mixture of CCA and derive an efficient inference algorithm based on Gibbs sampling. We show that the proposed method is a scalable, natural extension of CCA and RBF-type neural networks for high-dimensional non-linear problems.
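For orientation, the following is a minimal sketch of the standard linear CCA baseline that the proposed Bayesian mixture extends, not the paper's method itself. The data, dimensions, and latent-factor construction are illustrative assumptions; it uses scikit-learn's CCA to project two paired views onto a shared low-dimensional space.

```python
# A minimal sketch (assumed setup, not the paper's Bayesian mixture of CCA):
# standard linear CCA for dimensionality reduction on two paired views.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)

# Two paired "views": high-dimensional inputs X and targets Y sharing latent structure.
n, dx, dy, k = 200, 50, 10, 3
Z = rng.normal(size=(n, k))                                   # shared latent factors
X = Z @ rng.normal(size=(k, dx)) + 0.1 * rng.normal(size=(n, dx))
Y = Z @ rng.normal(size=(k, dy)) + 0.1 * rng.normal(size=(n, dy))

# Fit linear CCA and project both views onto k maximally correlated components.
cca = CCA(n_components=k)
X_c, Y_c = cca.fit_transform(X, Y)

# Canonical correlations between corresponding projected components.
corrs = [np.corrcoef(X_c[:, i], Y_c[:, i])[0, 1] for i in range(k)]
print("canonical correlations:", np.round(corrs, 3))
```

A mixture or Bayesian variant would replace this single global linear mapping with several locally linear CCA components whose assignments and parameters are inferred, e.g. by Gibbs sampling, which is what allows the non-linear structure mentioned above to be captured.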
