Abstract

This paper concerns the minimax center of a collection of linear subspaces. For k-dimensional subspaces of an n-dimensional vector space, this can be cast as finding the center of a minimum enclosing ball on a Grassmann manifold. For subspaces of differing dimension, the setting becomes a disjoint union of Grassmannians rather than a single manifold, and the problem is no longer well-defined. However, natural geometric maps exist between these manifolds, with a well-defined notion of distance for the images of the subspaces under the mappings. Solving the problem in this context leads to a candidate minimax center on each of the constituent manifolds, but does not provide intuition about which candidate is the best representation of the data. Additionally, the solutions of different rank are generally not nested, so a deflationary approach will not suffice, and the problem must be solved independently on each manifold. We propose an optimization problem parametrized by the rank of the minimax center. The solution is computed with a subgradient algorithm applied to the dual problem. By scaling the objective and penalizing the information lost by the rank-k minimax center, we jointly recover an optimal dimension, k⁎, and a subspace at the center of the minimum enclosing ball, U⁎, that best represents the data.
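The core object above is the minimax center, i.e. the minimizer of the maximum Grassmannian distance to the data subspaces. The sketch below is only an illustration of that fixed-dimension problem, not the paper's dual subgradient algorithm or its rank-selection penalty: it assumes subspaces of a common dimension k represented by orthonormal n × k bases, uses the chordal distance computed from principal angles, and repeatedly steps a candidate center along a geodesic toward the currently farthest subspace with a diminishing step size (a Bădoiu–Clarkson-style heuristic). Function names such as `minimax_center` and `grassmann_distance` are hypothetical.

```python
# Illustrative sketch: approximate minimax center of same-dimension subspaces
# on a Grassmannian (not the paper's dual subgradient method).
import numpy as np

def orthonormalize(A):
    """Orthonormal basis for the column span of A (thin QR)."""
    Q, _ = np.linalg.qr(A)
    return Q

def grassmann_distance(U, V):
    """Chordal distance between subspaces spanned by orthonormal n x k bases U, V."""
    cosines = np.clip(np.linalg.svd(U.T @ V, compute_uv=False), -1.0, 1.0)
    theta = np.arccos(cosines)           # principal angles
    return np.linalg.norm(np.sin(theta))

def geodesic_step(U, V, t):
    """Move a fraction t along the Grassmann geodesic from span(U) toward span(V)."""
    n = U.shape[0]
    # "Direction" matrix whose SVD encodes the principal angles between the spans.
    M = (np.eye(n) - U @ U.T) @ V @ np.linalg.inv(U.T @ V)
    Q, s, Wt = np.linalg.svd(M, full_matrices=False)
    theta = np.arctan(s)
    stepped = U @ Wt.T @ np.diag(np.cos(t * theta)) + Q @ np.diag(np.sin(t * theta))
    return orthonormalize(stepped)       # re-orthonormalize for numerical safety

def minimax_center(subspaces, n_iter=200):
    """Heuristic minimax center of a list of n x k orthonormal bases."""
    U = subspaces[0].copy()
    for i in range(1, n_iter + 1):
        farthest = max(subspaces, key=lambda V: grassmann_distance(U, V))
        U = geodesic_step(U, farthest, 1.0 / (i + 1))   # diminishing step size
    return U

# Usage: three random 2-dimensional subspaces of R^5.
rng = np.random.default_rng(0)
data = [orthonormalize(rng.standard_normal((5, 2))) for _ in range(3)]
center = minimax_center(data)
print(max(grassmann_distance(center, V) for V in data))  # radius of the enclosing ball
```

The maximum distance reported at the end is the radius of a ball enclosing all of the data subspaces, so it upper-bounds the optimal minimum enclosing ball radius; the paper instead solves the dual problem with a subgradient method and additionally selects the rank k⁎ via a penalized objective.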
