Abstract

Dimension reduction plays an essential role in decreasing the complexity of solving large-scale problems. The well-known Johnson-Lindenstrauss (JL) Lemma and the Restricted Isometry Property (RIP) justify the use of random projection to reduce dimension while preserving Euclidean distances, which has fueled the boom in sparsity-related signal processing. Recently, successful applications of sparse models in computer vision and machine learning have increasingly suggested that the underlying structure of high-dimensional data is better modeled as a union of subspaces (UoS). In this paper, motivated by the JL Lemma, we study for the first time the distance-preserving property of Gaussian random projection matrices for pairs of subspaces, using tools from the Grassmann manifold. We prove theoretically that, with high probability, the affinity and the distance between two compressed subspaces concentrate around their estimates. Numerical experiments verify the theoretical results.
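The concentration phenomenon the abstract describes can be illustrated numerically. The sketch below (not the paper's code; dimensions, subspace construction, and the affinity formula based on principal angles are illustrative assumptions) builds two subspaces of R^n that share a common part, compresses both with a Gaussian random projection, and compares the affinity before and after compression:

```python
import numpy as np

rng = np.random.default_rng(0)

def affinity(U, V):
    """Affinity of two subspaces from the cosines of their principal angles.

    U, V are orthonormal bases; the singular values of U^T V are the
    cosines cos(theta_i), and the affinity is
    sqrt(sum_i cos^2(theta_i) / min(dim U, dim V)).
    """
    s = np.clip(np.linalg.svd(U.T @ V, compute_uv=False), 0.0, 1.0)
    return np.sqrt(np.sum(s**2) / min(U.shape[1], V.shape[1]))

n, d, m = 1000, 5, 200  # ambient dim, subspace dim, compressed dim (illustrative)

# Two d-dimensional subspaces of R^n sharing two orthonormal directions,
# so their affinity is exactly sqrt(2/5) before compression.
Q, _ = np.linalg.qr(rng.standard_normal((n, 2 * d - 2)))
U, V = Q[:, :d], Q[:, d - 2:]

# Gaussian random projection to R^m; QR recovers orthonormal bases
# of the compressed subspaces span(A U) and span(A V).
A = rng.standard_normal((m, n)) / np.sqrt(m)
Uc, _ = np.linalg.qr(A @ U)
Vc, _ = np.linalg.qr(A @ V)

print("original affinity:  ", affinity(U, V))
print("compressed affinity:", affinity(Uc, Vc))
```

With high probability the two printed values are close, matching the claimed concentration of the affinity between compressed subspaces around its estimate.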

