Abstract
Dimensionality reduction is widely used to reduce the complexity of solving large-scale problems in machine learning and computer vision, where data often lie on latent low-dimensional structures. Motivated by this need, in this work we study the Restricted Isometry Property (RIP) of Gaussian random projections for low-dimensional subspaces in $\mathbb{R}^N$, and rigorously prove that the projection Frobenius norm distance between any two subspaces spanned by the projected data in $\mathbb{R}^n$ ($n<N$) remains almost the same as the distance between the original subspaces, with probability no less than $1 - {\rm e}^{-\mathcal{O}(n)}$. The well-known Johnson-Lindenstrauss (JL) Lemma and the RIP for sparse vectors have been the foundation of sparse signal processing, including Compressed Sensing. As an analogy to the JL Lemma and the RIP for sparse vectors, this work allows the use of random projections to reduce the ambient dimension with the theoretical guarantee that the distance between subspaces is well preserved after compression.
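The claim can be checked numerically. Below is a minimal sketch, assuming the projection Frobenius norm distance between subspaces $\mathcal{X}_1, \mathcal{X}_2$ is defined as $\frac{1}{\sqrt{2}}\|P_1 - P_2\|_F$, with $P_i$ the orthogonal projector onto $\mathcal{X}_i$; the function name `proj_frobenius_dist` and the chosen dimensions are illustrative, not taken from the paper.

```python
import numpy as np

def proj_frobenius_dist(A, B):
    """Projection Frobenius norm distance between the column spans of A and B.

    Orthonormal bases are obtained via reduced QR; the distance is
    (1/sqrt(2)) * ||U U^T - V V^T||_F.
    """
    U, _ = np.linalg.qr(A)
    V, _ = np.linalg.qr(B)
    return np.linalg.norm(U @ U.T - V @ V.T, "fro") / np.sqrt(2)

rng = np.random.default_rng(0)
N, n, d = 2000, 100, 5  # ambient dim, projected dim, subspace dim (illustrative)

# Two random d-dimensional subspaces in R^N, spanned by Gaussian columns
A = rng.standard_normal((N, d))
B = rng.standard_normal((N, d))

# Gaussian random projection R^N -> R^n with i.i.d. N(0, 1/n) entries
Phi = rng.standard_normal((n, N)) / np.sqrt(n)

d_before = proj_frobenius_dist(A, B)
d_after = proj_frobenius_dist(Phi @ A, Phi @ B)
print(f"distance before projection: {d_before:.4f}")
print(f"distance after projection:  {d_after:.4f}")
```

Under the stated result, the two printed distances should agree up to a small relative error with high probability once $n$ is sufficiently large relative to $d$.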