Abstract

Subspace learning aims to obtain low-dimensional representations of high-dimensional data in order to facilitate subsequent data storage and processing. Graph-based subspace learning is an effective family of subspace learning methods that models the data manifold with a graph, and it can be subsumed under the general spectral regression (SR) framework. By adopting a least-squares regression form as its objective function, spectral regression avoids eigen-decomposition of dense matrices and offers excellent flexibility. Spectral regression has recently achieved promising performance in diverse applications; however, it does not take the underlying class/task correlation patterns of the data into consideration. In this paper, we propose to improve the performance of spectral regression by exploiting the correlation among classes through low-rank modeling. The resulting low-rank spectral regression (LRSR) model decomposes the projection matrix in SR into two factor matrices, each with its own regularizer. The LRSR objective function can be solved within the alternating direction optimization framework. In addition to analyzing the differences between LRSR and existing related models, we conduct extensive experiments comparing LRSR with its full-rank counterpart on benchmark data sets, and the results demonstrate its superiority.
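To make the factored formulation concrete, the following is a minimal sketch of the alternating scheme the abstract describes, under assumptions the abstract does not fix: a least-squares data term, Frobenius-norm regularizers on both factors, and closed-form alternating updates rather than the paper's exact alternating direction procedure. All names (lrsr_fit, alpha, beta, rank) are illustrative, not the authors' notation.

```python
import numpy as np

def lrsr_fit(X, Y, rank, alpha=1e-2, beta=1e-2, n_iter=50, seed=0):
    """Sketch of low-rank spectral regression by alternating minimization.

    The d x c projection matrix is factored as W = A @ B, with
    A (d x rank) and B (rank x c) regularized separately. Y holds the
    spectral embedding obtained from the graph (one column per
    embedding dimension), as in the SR framework.
    """
    n, d = X.shape
    rng = np.random.default_rng(seed)
    B = rng.standard_normal((rank, Y.shape[1]))

    P = X.T @ X        # d x d Gram matrix, reused every iteration
    XtY = X.T @ Y      # d x c

    for _ in range(n_iter):
        # A-step: minimize ||X A B - Y||_F^2 + alpha ||A||_F^2.
        # Stationarity gives P A (B B^T) + alpha A = X^T Y B^T,
        # solved column-wise in the eigenbasis of the small r x r matrix B B^T.
        s, U = np.linalg.eigh(B @ B.T)       # B B^T = U diag(s) U^T
        C = XtY @ B.T @ U
        A_tilde = np.column_stack([
            np.linalg.solve(s[j] * P + alpha * np.eye(d), C[:, j])
            for j in range(rank)
        ])
        A = A_tilde @ U.T

        # B-step: ordinary ridge regression with features Z = X A,
        # minimizing ||Z B - Y||_F^2 + beta ||B||_F^2.
        Z = X @ A
        B = np.linalg.solve(Z.T @ Z + beta * np.eye(rank), Z.T @ Y)

    return A, B   # low-dimensional representation: X @ A @ B
```

Setting rank equal to the number of embedding dimensions recovers a full-rank ridge-style SR baseline, which is the comparison the experiments in the paper draw.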
