Abstract

In this note, we develop fast and deterministic dimensionality reduction techniques for a family of subspace approximation problems. We then utilize these dimensionality reduction techniques to help rapidly and accurately approximate the \(n\)-widths of point sets. Let \(P \subset \mathbb {R}^N\) be a given set of \(M\) points. The techniques developed herein find an \(O(n \log M)\)-dimensional subspace that is guaranteed to always contain a near-best fit \(n\)-dimensional hyperplane \(\mathcal {H}\) for \(P\) with respect to the cumulative projection error \(\left( \sum _{\mathbf{x} \in P} \Vert \mathbf{x} - \Pi _{\mathcal {H}} \mathbf{x} \Vert ^p_2 \right) ^{1/p}\), for any chosen \(p > 2\). The deterministic algorithm runs in \(\tilde{O} \left( MN^2 \right) \)-time, and can be randomized to run in only \(\tilde{O} \left( MNn \right) \)-time while maintaining its error guarantees with high probability. In the important \(p = \infty \) case, the dimensionality reduction techniques are then combined with efficient algorithms for computing the John ellipsoid of a data set in order to produce an \(n\)-dimensional subspace whose maximum \(\ell _2\)-distance to any point in the convex hull of \(P\) is minimized. The resulting algorithm remains in \(\tilde{O} \left( MNn \right) \)-time.
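The abstract's cost functional, \(\left( \sum _{\mathbf{x} \in P} \Vert \mathbf{x} - \Pi _{\mathcal {H}} \mathbf{x} \Vert ^p_2 \right) ^{1/p}\), can be evaluated directly for any candidate subspace. The following sketch (not the paper's algorithm, just an illustration of the objective; the function name and the SVD-based candidate subspace are illustrative choices) computes the cumulative projection error for a subspace given by an orthonormal basis:

```python
import numpy as np

def projection_error(P, H, p):
    """Cumulative projection error (sum_x ||x - Pi_H x||_2^p)^(1/p).

    P : (M, N) array of points; H : (N, n) orthonormal basis of the
    candidate subspace; p : exponent (p = np.inf gives the max distance,
    the p = infinity case discussed in the abstract).
    """
    residuals = P - (P @ H) @ H.T          # x - Pi_H x for each point
    dists = np.linalg.norm(residuals, axis=1)
    if np.isinf(p):
        return dists.max()
    return (dists ** p).sum() ** (1.0 / p)

# Illustration: 100 random points in R^10, with the top-3 right singular
# vectors as a candidate 3-dimensional subspace (optimal for p = 2).
rng = np.random.default_rng(0)
P = rng.standard_normal((100, 10))
_, _, Vt = np.linalg.svd(P, full_matrices=False)
H = Vt[:3].T
err = projection_error(P, H, p=2.0)
```

For p = 2 this reduces to the classical total least-squares error minimized by the truncated SVD; for p > 2 (and especially p = infinity) the optimal subspace generally differs, which is what motivates the dimensionality reduction techniques described above.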
