Abstract

For the problem of nonparametric regression of smooth functions, we reconsider and analyze a constrained variational approach, which we call the MultIscale Nemirovski-Dantzig (MIND) estimator. This can be viewed as a multiscale extension of the Dantzig selector (\emph{Ann. Statist.}, 35(6): 2313--2351, 2007) based on early ideas of Nemirovski (\emph{J. Comput. System Sci.}, 23:1--11, 1986). MIND minimizes a homogeneous Sobolev norm under the constraint that the multiresolution norm of the residual is bounded by a universal threshold. The main contribution of this paper is the derivation of convergence rates of MIND with respect to $L^q$-loss, $1 \le q \le \infty$, both almost surely and in expectation. To this end, we introduce the method of approximate source conditions. For a one-dimensional signal, these can be translated into approximation properties of $B$-splines. A remarkable consequence is that MIND attains almost minimax optimal rates simultaneously for a large range of Sobolev and Besov classes, which provides a certain degree of adaptation. Complementary to the asymptotic analysis, we examine the finite sample performance of MIND by numerical simulations.
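As a schematic sketch of such a constrained variational estimator (the notation $D^k$, $\mathcal{B}$, and $\gamma_n$ below is introduced here only for illustration and is not taken verbatim from the paper), one may write
$$\hat{f} \in \operatorname*{argmin}_{g}\; \|D^k g\|_{L^2}^2 \quad \text{subject to} \quad \max_{B \in \mathcal{B}} \frac{1}{\sqrt{|B|}} \Bigl|\sum_{i \in B} \bigl(y_i - g(x_i)\bigr)\Bigr| \le \gamma_n,$$
where $\|D^k g\|_{L^2}$ is a homogeneous Sobolev seminorm of order $k$, the left-hand side of the constraint is a multiresolution norm of the residuals $y_i - g(x_i)$ over a system $\mathcal{B}$ of subsets of the design points, and $\gamma_n$ is a universal threshold.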
