Abstract
Dimension reduction plays an essential role in decreasing the complexity of solving large-scale problems. The well-known Johnson-Lindenstrauss (JL) Lemma and the Restricted Isometry Property (RIP) justify the use of random projection to reduce dimension while approximately preserving Euclidean distances, which led to the boom of Compressed Sensing and the broader field of sparsity-related signal processing. Recently, successful applications of sparse models in computer vision and machine learning have increasingly suggested that high-dimensional data are often better modeled as a union of subspaces (UoS). In this paper, motivated by the JL Lemma and the emerging field of Compressed Subspace Clustering (CSC), we study for the first time the RIP of Gaussian random matrices for the compression of two subspaces, based on the generalized projection $F$-norm distance. We theoretically prove that, with high probability, the affinity and the distance between two projected subspaces are concentrated around their estimates. When the ambient dimension after projection is sufficiently large, the affinity and distance between two subspaces remain almost unchanged after random projection. Numerical experiments verify the theoretical results.
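The concentration claimed above can be checked numerically. The following NumPy sketch is an illustration, not the paper's experiment: it uses the standard definitions from the subspace clustering literature, affinity $\mathrm{aff}(\mathcal{S}_1,\mathcal{S}_2)=\|U_1^\top U_2\|_F$ and projection $F$-norm distance $D(\mathcal{S}_1,\mathcal{S}_2)=\tfrac{1}{\sqrt{2}}\|U_1U_1^\top - U_2U_2^\top\|_F$ for orthonormal bases $U_1,U_2$, and the dimensions `n`, `d`, `m` and the $\mathcal{N}(0,1/m)$ scaling of the Gaussian matrix are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def orthonormal_basis(n, d, rng):
    """Random d-dimensional subspace of R^n, returned as an n x d orthonormal basis."""
    Q, _ = np.linalg.qr(rng.standard_normal((n, d)))
    return Q

def affinity(U1, U2):
    """aff(S1, S2) = ||U1^T U2||_F; the singular values of U1^T U2 are the
    cosines of the principal angles between the two subspaces."""
    return np.linalg.norm(U1.T @ U2)  # Frobenius norm by default

def proj_fnorm_distance(U1, U2):
    """Projection F-norm distance: ||U1 U1^T - U2 U2^T||_F / sqrt(2)."""
    return np.linalg.norm(U1 @ U1.T - U2 @ U2.T) / np.sqrt(2)

# Illustrative sizes (assumptions): ambient dim n, subspace dim d, projected dim m
n, d, m = 1000, 5, 50
U1 = orthonormal_basis(n, d, rng)
U2 = orthonormal_basis(n, d, rng)

# Gaussian random projection Phi: R^n -> R^m with i.i.d. N(0, 1/m) entries
Phi = rng.standard_normal((m, n)) / np.sqrt(m)

# The projected subspaces are spanned by Phi @ Ui; re-orthonormalize before measuring
V1, _ = np.linalg.qr(Phi @ U1)
V2, _ = np.linalg.qr(Phi @ U2)

print("affinity before/after:", affinity(U1, U2), affinity(V1, V2))
print("distance before/after:", proj_fnorm_distance(U1, U2), proj_fnorm_distance(V1, V2))
```

Note that for two $d$-dimensional subspaces these quantities satisfy $D^2 = d - \mathrm{aff}^2$, so concentration of one implies concentration of the other; as `m` grows, the before/after values printed above should agree increasingly closely, consistent with the stated result.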