Abstract

We consider statistical linear inverse problems in separable Hilbert spaces and filter-based reconstruction methods of the form $\hat f_\alpha = q_\alpha(T^*T)\,T^*Y$, where $Y$ is the available data, $T$ the forward operator, $(q_\alpha)_{\alpha \in A}$ an ordered filter, and $\alpha > 0$ a regularization parameter. Whenever such a method is used in practice, $\alpha$ has to be chosen appropriately. Typically, the aim is to find, or at least approximate, the best possible $\alpha$ in the sense that the mean squared error (MSE) $\mathbb{E}\big[\|\hat f_\alpha - f^\dagger\|^2\big]$ w.r.t. the true solution $f^\dagger$ is minimized. In this paper, we introduce the Sharp Optimal Lepskiĭ-Inspired Tuning (SOLIT) method, which yields an a posteriori parameter choice rule ensuring adaptive minimax rates of convergence. It depends only on the data $Y$, the noise level $\sigma$, the operator $T$, and the filter $(q_\alpha)_{\alpha \in A}$, and does not require any problem-dependent tuning of further parameters. We prove an oracle inequality for the corresponding MSE in a general setting and derive the rates of convergence in different scenarios. By a careful analysis we show that no other a posteriori parameter choice rule can yield a better performance in terms of the order of the convergence rate of the MSE. In particular, our results reveal that the common understanding that Lepskiĭ-type methods in inverse problems necessarily lose a logarithmic factor is wrong. In addition, the empirical performance of SOLIT is examined in simulations.
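To make the setting concrete, the following minimal sketch illustrates a filter-based reconstruction $\hat f_\alpha = q_\alpha(T^*T)\,T^*Y$ in a finite-dimensional discretization, using the Tikhonov filter $q_\alpha(\lambda) = 1/(\lambda + \alpha)$ together with a classical Lepskiĭ-type balancing choice of $\alpha$. All concrete choices here (the diagonal test operator, the Tikhonov filter, the crude noise bound with an arbitrary constant) are illustrative assumptions; this is a generic balancing rule, not the SOLIT rule of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 200
# Hypothetical mildly ill-posed forward operator: diagonal with
# polynomially decaying singular values (an assumption for illustration).
s = np.arange(1, n + 1, dtype=float) ** -1.0
T = np.diag(s)

f_true = np.sin(np.linspace(0, np.pi, n))          # true solution f†
sigma = 1e-3                                        # known noise level σ
Y = T @ f_true + sigma * rng.standard_normal(n)     # data Y = T f† + σξ

def tikhonov_reconstruction(Y, T, alpha):
    """f_hat_alpha = q_alpha(T*T) T* Y with q_alpha(λ) = 1/(λ + α)."""
    U, sv, Vt = np.linalg.svd(T, full_matrices=False)
    coeffs = sv / (sv**2 + alpha) * (U.T @ Y)       # filtered spectral coefficients
    return Vt.T @ coeffs

def noise_bound(T, alpha, sigma):
    """Crude bound on the stochastic error of f_hat_alpha; the constant 4
    is an arbitrary illustrative choice, not taken from the paper."""
    sv = np.linalg.svd(T, compute_uv=False)
    return 4 * sigma * np.linalg.norm(sv / (sv**2 + alpha))

# Classical Lepskii-type balancing (generic sketch, not SOLIT): among an
# increasing grid of alphas, keep the largest alpha whose reconstruction
# stays within a noise-calibrated distance of all less-regularized ones.
alphas = np.geomspace(1e-8, 1e-1, 30)
f_hats = [tikhonov_reconstruction(Y, T, a) for a in alphas]

chosen, f_chosen = alphas[0], f_hats[0]
for i in range(1, len(alphas)):
    ok = all(
        np.linalg.norm(f_hats[i] - f_hats[j]) <= noise_bound(T, alphas[j], sigma)
        for j in range(i)
    )
    if not ok:
        break
    chosen, f_chosen = alphas[i], f_hats[i]

print(f"chosen alpha = {chosen:.2e}, "
      f"error = {np.linalg.norm(f_chosen - f_true):.3e}")
```

Larger $\alpha$ increases the (deterministic) approximation error while shrinking the stochastic error, so the balancing loop stops at the first $\alpha$ whose bias is no longer dominated by the noise-level bound.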
