Abstract

This paper studies binary classification in robust one-bit compressed sensing with adversarial errors. The model is assumed to be overparametrized, and the parameter of interest is assumed to be effectively sparse. AdaBoost is considered and, through its relation to the maximum-$\ell_1$-margin classifier, prediction error bounds are derived. The developed theory is general and allows for heavy-tailed feature distributions, requiring only a weak moment assumption and an anti-concentration condition. Improved convergence rates are shown when the features satisfy a small-deviation lower bound. In particular, the results explain why interpolating adversarial noise can be harmless for classification problems. Simulations illustrate the presented theory.
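For illustration, the following is a minimal Python sketch of the setup the abstract describes: sign measurements of an effectively sparse signal in an overparametrized regime ($p \gg n$), a fraction of adversarially flipped labels, and boosting toward the maximum-$\ell_1$-margin classifier. The dimensions, sparsity level, corruption fraction, Gaussian feature choice, and the fixed-step coordinate-descent variant of AdaBoost are all assumptions made here, not the paper's exact experimental specification.

```python
import numpy as np

rng = np.random.default_rng(0)

# Overparametrized one-bit model: p >> n, effectively s-sparse signal.
# (Dimensions, sparsity, and corruption level are illustrative choices.)
n, p, s = 100, 1000, 5
beta = np.zeros(p)
beta[:s] = 1.0 / np.sqrt(s)

X = rng.standard_normal((n, p))   # Gaussian features, for illustration only
y = np.sign(X @ beta)             # one-bit (sign) measurements

flip = rng.choice(n, size=int(0.05 * n), replace=False)
y[flip] *= -1                     # adversarial label corruptions

def adaboost_linear(X, y, n_rounds=2000, step=0.05):
    """Coordinate-descent boosting on the exponential loss, with the
    coordinates +/- x_j as base learners. The small fixed step size is a
    common simplification of AdaBoost's exact line search; under standard
    margin theory the normalized iterate tends toward the maximum-l1-margin
    direction on separable (here: interpolated) data."""
    n, p = X.shape
    w = np.zeros(p)
    for _ in range(n_rounds):
        m = y * (X @ w)                  # current margins
        d = np.exp(-(m - m.min()))       # AdaBoost example weights (stable, up to scaling)
        d /= d.sum()
        corr = X.T @ (d * y)             # weighted correlation of each coordinate
        j = np.argmax(np.abs(corr))      # best weak learner this round
        w[j] += step * np.sign(corr[j])
    return w

w = adaboost_linear(X, y)

# Training error is typically driven to zero (the corrupted labels are
# interpolated), while test error against the clean signal stays small.
X_test = rng.standard_normal((2000, p))
y_test = np.sign(X_test @ beta)
print("train error:", np.mean(np.sign(X @ w) != y))
print("test error :", np.mean(np.sign(X_test @ w) != y_test))
```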
