Abstract

Motivated by a statistical learning theoretic treatment of principal component analysis, we are concerned with the set of points in ℝ^d that lie within a certain distance of a k-dimensional affine subspace. We prove that the VC dimension of the class of such sets is within a constant factor of (k+1)(d−k+1), and then discuss the distribution of the eigenvalues of a data covariance matrix using our bounds on the VC dimension together with Vapnik's statistical learning theory. In the course of the upper bound proof, we also give a simple proof of Warren's bound on the number of sign sequences of real polynomials.
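To make the concept class concrete, below is a minimal sketch (in Python with NumPy) of the membership test defining one such set: the points of ℝ^d within distance r of a k-dimensional affine subspace, represented here by a base point p and a d×k orthonormal basis V of its direction space. The representation and function names are ours, chosen for illustration; the paper itself is purely theoretical and does not prescribe any implementation.

```python
import numpy as np

def dist_to_affine_subspace(x, p, V):
    """Distance from a point x in R^d to the affine subspace p + span(V).

    V is a d x k matrix whose columns form an orthonormal basis of the
    subspace's direction space (an assumption of this sketch).
    """
    diff = x - p
    # Project diff onto span(V); the residual is orthogonal to the subspace,
    # so its norm is the Euclidean distance to the subspace.
    residual = diff - V @ (V.T @ diff)
    return np.linalg.norm(residual)

def in_tube(x, p, V, r):
    """Membership in the set of points within distance r of p + span(V)."""
    return dist_to_affine_subspace(x, p, V) <= r

# Example: a 1-dimensional affine subspace (a line) in R^3,
# so d = 3 and k = 1.
p = np.array([1.0, 0.0, 0.0])
V = np.array([[0.0], [1.0], [0.0]])  # orthonormal basis of the direction

print(in_tube(np.array([1.0, 5.0, 0.3]), p, V, r=0.5))  # True: distance 0.3
print(in_tube(np.array([0.0, 0.0, 2.0]), p, V, r=0.5))  # False: distance sqrt(5)
```

The VC dimension result concerns the class of all such sets as the subspace (and the distance) varies, which in this parameterization corresponds to varying p, V, and r.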
