Abstract
We study the optimal convergence rate for the universal estimation error. Let $\mathcal{F}$ be the excess loss class associated with the hypothesis space, and let $n$ be the size of the data set. We prove that if the fat-shattering dimension satisfies $\operatorname{fat}_{\epsilon}(\mathcal{F}) = O(\epsilon^{-p})$, then the universal estimation error is of order $O(n^{-1/2})$ for $p < 2$ and of order $O(n^{-1/p})$ for $p > 2$. Among other things, this result gives a criterion for a hypothesis class to achieve the minimax optimal rate of $O(n^{-1/2})$. We also show that if the hypothesis space is the class of compactly supported convex Lipschitz continuous functions on $\mathbb{R}^d$ with $d > 4$, then the rate is approximately $O(n^{-2/d})$.
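For concreteness, the main rate result can be written as a single displayed implication. The following is a hedged restatement in our own notation: we formalize the universal estimation error as the worst-case (over data-generating distributions $P$) expected uniform deviation over the excess loss class, which is an assumption about the paper's setup rather than a quote from it.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Hedged restatement of the abstract's main theorem.
% Assumptions (ours, not verbatim from the paper):
%   P ranges over data-generating distributions;
%   Pf denotes the population mean of f and P_n f its
%   empirical mean over the n samples, so the inner
%   supremum is the uniform deviation over the excess
%   loss class \mathcal{F}. The critical case p = 2 is
%   not stated in the abstract and is omitted here too.
\[
  \operatorname{fat}_{\epsilon}(\mathcal{F}) = O(\epsilon^{-p})
  \;\Longrightarrow\;
  \sup_{P}\, \mathbb{E}\Bigl[\,\sup_{f \in \mathcal{F}} \bigl(Pf - P_n f\bigr)\Bigr]
  =
  \begin{cases}
    O\bigl(n^{-1/2}\bigr), & p < 2,\\
    O\bigl(n^{-1/p}\bigr), & p > 2.
  \end{cases}
\]
\end{document}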