Abstract

In the context of nonparametric regression and inverse problems, variational multiscale methods combine multiscale dictionaries with regularization functionals in a variational framework. In recent years, these methods have gained popularity in nonparametric statistics due to their good reconstruction properties. Nevertheless, their theoretical performance is, with few exceptions, poorly understood. In this thesis we apply variational multiscale methods to the estimation of functions of bounded variation ($BV$). $BV$ functions are relevant in many applications, since they involve minimal smoothness assumptions and yield simplified, interpretable reconstructions. However, these functions are remarkably difficult to analyze, and to date there is no statistical theory for the estimation of $BV$ functions in dimension $d\geq 2$. The main theoretical contribution of this thesis is a proof that a class of multiscale estimators with a $BV$ penalty is minimax optimal up to logarithmic factors for the estimation of $BV$ functions in regression and inverse problems in any dimension. Conceptually, the proof exploits a connection between multiscale dictionaries and Besov spaces. Beyond the theoretical analysis, we also address the efficient implementation and computation of the estimator and illustrate its performance in a simulation study.
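To fix ideas, a variational multiscale estimator of the kind described above can be sketched schematically as follows (this is an illustrative generic form, not necessarily the exact estimator analyzed in the thesis; the dictionary $\{\phi_k\}$, forward operator $T$, and threshold $\gamma_n$ are placeholder notation):

```latex
\hat{f} \in \operatorname*{argmin}_{f \in BV}
  \; |f|_{BV}
  \quad \text{subject to} \quad
  \max_{k} \,\bigl|\langle \phi_k,\, Y - T f \rangle\bigr| \leq \gamma_n ,
```

where $Y$ denotes the observed data, $T$ is the forward operator (the identity in the regression setting), and the constraint enforces data fidelity simultaneously at all scales of the multiscale dictionary, while the $BV$ seminorm acts as the regularization functional.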
