Abstract

This work deals with the ill-posed inverse problem of reconstructing a function f given implicitly as the solution of g = Af, where A is a compact linear operator with unknown singular values and known eigenfunctions. We observe the function g and the singular values of the operator under Gaussian white noise with respective noise levels ϵ and σ. We develop a minimax theory in terms of both noise levels and propose an orthogonal series estimator attaining the minimax rates. This estimator requires the optimal choice of a dimension parameter depending on certain characteristics of f and A. We then address the fully data-driven choice of the dimension parameter, combining model selection with Lepski's method. We show that the fully data-driven estimator preserves minimax optimality over a wide range of classes for f and A and of noise levels ϵ and σ. The results are illustrated by considering Sobolev spaces and both mildly and severely ill-posed inverse problems.
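To make the estimator concrete, the following sketch (a minimal illustration, not the paper's construction) simulates the sequence-space form of the model: with known eigenfunctions, recovering f reduces to recovering its coefficients f_j from noisy observations of g_j = λ_j f_j and of the singular values λ_j. The decay rates, noise levels, thresholding rule, and the oracle choice of the dimension k below are all illustrative assumptions; the paper's actual data-driven rule combining model selection with Lepski's method is not reproduced here.

```python
# Minimal sketch of a spectral cut-off (orthogonal series) estimator in the
# sequence-space formulation of g = Af. All concrete choices (polynomial decay
# of the singular values, Sobolev-type coefficients, the threshold on the
# estimated singular values, the dimension grid) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# --- assumed problem setup ---
n = 500                                   # number of simulated coefficients
j = np.arange(1, n + 1)
f_true = j ** (-1.5) * np.cos(j)          # Sobolev-type decay of the true coefficients
lam = j ** (-1.0)                         # mildly ill-posed: polynomially decaying singular values
eps, sigma = 1e-3, 1e-3                   # noise levels for g and for the singular values

# --- noisy observations of g = Af and of the singular values ---
g_hat = lam * f_true + eps * rng.standard_normal(n)
lam_hat = lam + sigma * rng.standard_normal(n)

def series_estimator(g_hat, lam_hat, k, sigma):
    """Spectral cut-off estimator with dimension parameter k.

    Coefficients are reconstructed as g_hat_j / lam_hat_j for j <= k, but only
    where the estimated singular value is safely above the noise level sigma
    (a simple thresholding rule, used here purely as an assumption)."""
    f_hat = np.zeros_like(g_hat)
    keep = np.arange(len(g_hat)) < k
    safe = np.abs(lam_hat) > 2 * sigma    # avoid dividing by pure noise
    use = keep & safe
    f_hat[use] = g_hat[use] / lam_hat[use]
    return f_hat

# --- oracle choice of the dimension over a grid (stand-in for the data-driven rule) ---
dims = np.arange(1, 200)
risks = [np.sum((series_estimator(g_hat, lam_hat, k, sigma) - f_true) ** 2)
         for k in dims]
k_oracle = dims[int(np.argmin(risks))]
print(f"oracle dimension k = {k_oracle}, squared error = {min(risks):.3e}")
```

The oracle minimization over the dimension grid only illustrates the bias-variance trade-off governed by the dimension parameter; it uses the unknown f_true and is therefore exactly what the fully data-driven procedure in the paper is designed to replace.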
