Abstract

In this paper we consider variational regularization methods for inverse problems with large noise that is in general unbounded in the image space of the forward operator. We introduce a Banach space setting that allows us to define a reasonable notion of solutions for more general noise in a larger space, provided that the forward operators have sufficient mapping properties. A key observation, which guides us through the subsequent analysis, is that such a general noise model can be understood within the same setting as approximate source conditions (while a standard model of bounded noise is related directly to classical source conditions). Based on this insight we obtain a quite general existence result for regularized variational problems and derive error estimates in terms of Bregman distances. The latter is specialized for the particularly important cases of one- and $p$-homogeneous regularization functionals. As a natural further step we study stochastic noise models, in particular white noise, for which we derive error estimates in terms of the expectation of the Bregman distance. The finiteness of certain expectations leads to a novel class of abstract smoothness conditions on the forward operator, which can be easily interpreted in the Hilbert space case. We finally exemplify the approach, and in particular the conditions, for popular examples of regularization functionals given by the squared norm, Besov norms and total variation.

Highlights

  • Motivated by stochastic modeling of noise, in particular white noise, the treatment of inverse problems with large noise has received strong attention recently (Egger, 2008; Eggermont et al, 2009; Mathé & Tautenhahn, 2011; Kekkonen et al, 2014; Kekkonen et al, 2015)

  • Some difficulties related to the appropriate formulation of the regularized problem with white noise do not arise in this setting

  • Due to the analogous roles of μ† and η, it is natural to use the same paradigm for approximating the large noise; this is the basic foundation of the analysis in this paper


Summary

Introduction

Motivated by stochastic modeling of noise, in particular white noise, the treatment of inverse problems with large noise has received strong attention recently (Egger, 2008; Eggermont et al, 2009; Mathé & Tautenhahn, 2011; Kekkonen et al, 2014; Kekkonen et al, 2015). Due to the analogous roles of μ† and η, it is natural to use the same paradigm for approximating the large noise, and this is the basic foundation of the analysis in this paper. Following this idea, our key contribution is to derive Bregman-distance-based error estimates between uαδ and u† for a general convex R. Given a deterministic noise model, one can derive explicit convergence rate results from (a variation of) an approximate source condition on μ† and η. In this work our interest lies in the frequentist risk between the estimator Uαδ = Uαδ(ω) and the true unknown u†. In this paradigm we find that the expected decay rate of the approximate source condition of the noise term is sufficient to guarantee a convergence rate result.
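The objects appearing above can be made concrete. As a sketch only, assuming the standard Tikhonov-type formulation with a Hilbert-space data-fidelity term (the paper's actual Banach space setting generalizes the fidelity to accommodate noise η outside the image space of the forward operator A; the symbols A, f^δ and δ are standard notation not fixed by the summary above):

```latex
% Variational regularization (standard-form sketch; the paper's
% setting replaces the Hilbert-space fidelity by a more general one):
u_\alpha^\delta \in \operatorname*{arg\,min}_{u}
  \; \tfrac{1}{2}\,\bigl\| A u - f^\delta \bigr\|^2 + \alpha\, R(u),
\qquad f^\delta = A u^\dagger + \delta\,\eta .

% Bregman distance of the convex regularizer R, the error measure
% used in the estimates between u_\alpha^\delta and u^\dagger:
D_R^{p^\dagger}\!\bigl(u_\alpha^\delta, u^\dagger\bigr)
  = R(u_\alpha^\delta) - R(u^\dagger)
    - \bigl\langle p^\dagger,\, u_\alpha^\delta - u^\dagger \bigr\rangle,
\qquad p^\dagger \in \partial R(u^\dagger).
```

In the frequentist setting mentioned above, the estimates are taken in expectation, i.e. bounds on E[D_R^{p†}(U_α^δ, u†)] over the noise realization ω.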

Existence and a priori estimates
Basic ingredients of error estimates
A variation on approximate source condition
Error estimates
Convergence rates for homogeneous regularizations
Regularization by one-homogeneous functionals
Hilbert space embedding
Frequentist framework
Gaussian case
Besov penalty
Outlook to the Bayesian approach

