Abstract

Subspace learning aims to retain desirable data properties while reducing dimensionality by projecting high-dimensional data into a low-dimensional subspace, and it is an active topic in the machine learning and pattern recognition communities. In graph-based subspace learning, the quality of the constructed graph strongly affects the subsequent learning of the projection matrix. In particular, when data are noisy or even grossly corrupted, the constructed graph usually fails to respect the intrinsic structure of the data. Moreover, the widely used two-stage strategy, which learns the projection matrix on a fixed data graph, isolates two closely related stages. To address these drawbacks, we propose a joint kernel low-rank graph construction and subspace learning (KLGSL) model built on the theory of low-rank matrix recovery and spectral regression. In KLGSL, a kernel low-rank representation characterizes the possibly nonlinear structure of the data, and the objectives of kernel low-rank representation and spectral regression co-evolve toward the optimum; subspace learning is therefore performed efficiently on the recovered data produced by low-rank learning, and the two formerly separate processes of graph construction and subspace learning are merged into a single model. The KLGSL objective can be optimized efficiently within the augmented Lagrange multiplier framework. We evaluate KLGSL through extensive experiments on representative benchmark data sets; the results show that low-rank learning greatly facilitates subspace learning.
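To make the two ingredients concrete, the sketch below illustrates a plain two-stage pipeline of the kind KLGSL improves upon: a low-rank graph built from a kernel, followed by spectral regression. It is not the joint KLGSL optimization described in the abstract; it uses the closed-form solution of noiseless kernel low-rank representation (the shape-interaction matrix of the Gram matrix) in place of the full ALM-based solver, and the RBF kernel, the energy threshold, and the ridge parameter are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.linalg import eigh

def rbf_kernel(X, gamma=1.0):
    # RBF Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def kernel_lrr_graph(K, energy=0.99):
    # Noiseless kernel LRR admits the closed-form solution Z* = V_r V_r^T,
    # where K = V diag(s) V^T and V_r holds the leading eigenvectors.
    # Truncating by spectral energy is a heuristic stand-in (assumption)
    # for the ALM-based solver used in the paper.
    s, V = eigh(K)
    order = np.argsort(s)[::-1]
    s, V = s[order], V[:, order]
    r = np.searchsorted(np.cumsum(s) / s.sum(), energy) + 1
    Z = V[:, :r] @ V[:, :r].T
    return (np.abs(Z) + np.abs(Z.T)) / 2.0  # symmetric affinity graph

def spectral_regression(X, W, dim=2, alpha=0.1):
    # Stage 1: embed via the smallest nontrivial eigenvectors of the
    # normalized Laplacian L = I - D^{-1/2} W D^{-1/2}.
    d = W.sum(axis=1)
    Dm = np.diag(1.0 / np.sqrt(d + 1e-12))
    L = np.eye(len(W)) - Dm @ W @ Dm
    _, vecs = eigh(L)
    Y = vecs[:, 1:dim + 1]  # skip the trivial constant eigenvector
    # Stage 2: ridge-regress the embedding onto the features to obtain
    # the projection matrix A (min ||X A - Y||^2 + alpha ||A||^2).
    A = np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ Y)
    return A, Y

# Usage: project (possibly noisy) data through the low-rank graph pipeline.
X = np.random.randn(100, 20)
W = kernel_lrr_graph(rbf_kernel(X, gamma=0.05))
A, Y = spectral_regression(X, W, dim=2)
X_low = X @ A  # low-dimensional representation
```

The point of KLGSL is precisely that these two stages, run separately as above, let graph errors propagate into the projection; the joint model instead lets the low-rank graph and the spectral-regression objective co-evolve.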
