Abstract
As data dimensionality grows, dimensionality-reduction techniques such as random projection become necessary. In many applications, the underlying structure of the data is a union of subspaces, and the relative position of subspaces is well described by their principal angles. Motivated by the conjecture that principal angles between subspaces remain almost unchanged after Gaussian random projection, in this work we prove that this angle-preserving property holds with probability exponentially close to 1. Compared with our previous work, we use more advanced techniques, and the improved conclusion is both more rigorous in theory and more useful in practice. Because principal angles are central to characterizing the relation between subspaces, many data-mining algorithms involving subspaces are based on them; this work may therefore help extend such algorithms to compressive scenarios, where random dimensionality reduction fundamentally lowers the computational complexity. Experiments on real datasets verify that, with a compression ratio as small as 0.05, the principal angles between subspaces after random projection remain sufficiently close to the original principal angles.
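The setting above can be illustrated with a minimal sketch in NumPy/SciPy: compute the principal angles between two subspaces, apply a Gaussian random projection with compression ratio 0.05 (the ratio from the abstract's experiments), and compare the angles before and after. The ambient dimension and subspace dimensions below are hypothetical choices for illustration, not the paper's experimental setup.

```python
import numpy as np
from scipy.linalg import subspace_angles

rng = np.random.default_rng(0)

# Hypothetical sizes: ambient dimension n, two subspaces of dimension d1, d2
n, d1, d2 = 2000, 5, 5

# Orthonormal bases for two random subspaces of R^n
U = np.linalg.qr(rng.standard_normal((n, d1)))[0]
V = np.linalg.qr(rng.standard_normal((n, d2)))[0]

# Gaussian random projection to m dimensions; m/n = 0.05 is the compression ratio
m = int(0.05 * n)
P = rng.standard_normal((m, n)) / np.sqrt(m)

# Principal angles (in radians) before and after projection
angles_before = subspace_angles(U, V)
angles_after = subspace_angles(P @ U, P @ V)

print("max angle deviation (rad):",
      np.max(np.abs(angles_before - angles_after)))
```

`scipy.linalg.subspace_angles` returns the principal angles via SVDs of the (orthonormalized) basis products; the projected bases `P @ U` and `P @ V` need not be orthonormal, which the function handles internally. For these sizes the deviation is small relative to the angles themselves, consistent with the preservation property the abstract states.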