Abstract

While it is well known that nonlinear methods of approximation can often perform dramatically better than linear methods, questions remain about how to measure the best performance achievable by such methods. This paper studies nonlinear methods of approximation that are compatible with numerical implementation in the sense that they are required to be numerically stable. A measure of optimal performance, called stable manifold widths, is introduced for the approximation of a model class K in a Banach space X by stable manifold methods. Fundamental inequalities between these stable manifold widths and the entropy of K are established. The effects of requiring stability in the settings of deep learning and compressed sensing are discussed.
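For a concrete picture of the kind of quantity involved, here is a minimal sketch in standard width notation; the symbols a, M, \gamma, and \varepsilon_n below are assumed notation, not taken from the abstract. A stable manifold width constrains the usual manifold width by requiring the encoding and decoding maps to be Lipschitz:

\[
\delta^{*}_{n,\gamma}(K)_X \;:=\; \inf_{a,\,M}\; \sup_{f\in K}\; \|f - M(a(f))\|_X,
\]

where the infimum is taken over maps $a\colon K\to\mathbb{R}^n$ and $M\colon\mathbb{R}^n\to X$ that are both $\gamma$-Lipschitz; the Lipschitz bound $\gamma$ encodes the numerical-stability requirement. Writing $\varepsilon_n(K)_X$ for the entropy numbers of K (the smallest $\varepsilon$ such that K can be covered by $2^n$ balls of radius $\varepsilon$ in X), a fundamental inequality of the advertised type would take the form

\[
\delta^{*}_{n,\gamma}(K)_X \;\ge\; c\,\varepsilon_{Cn}(K)_X,
\]

with constants $c, C$ depending only on $\gamma$, expressing that stable manifold methods cannot approximate K better than its entropy allows.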
