Abstract

This article deals with the solution of linear ill-posed equations in Hilbert spaces. Often, only a corrupted measurement of the right-hand side is available, and the Bakushinskii veto tells us that we cannot solve the equation in a convergent fashion if we do not know the noise level. In applications, however, it is unrealistic to know the error of a measurement a priori. In practice, the error may often be estimated by averaging multiple measurements. We integrate this into our analysis and obtain convergence to the true solution under the sole assumption that the measurements are unbiased, independent and identically distributed according to an unknown distribution.
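To make the averaging idea concrete, the following minimal sketch (all names such as y_true and delta_est, the grid size and the Laplace noise are illustrative assumptions, not taken from the paper) shows how averaging n unbiased i.i.d. measurements reduces the data error, and how the sample variance yields an estimate of the remaining noise level without knowledge of the distribution:

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative setup: y_true is a discretised exact right-hand side; we observe
    # n unbiased i.i.d. measurements y_i = y_true + eps_i with unknown noise law
    # (Laplace noise here only stands in for that unknown distribution).
    m, n = 200, 50
    y_true = np.sin(np.linspace(0.0, np.pi, m))
    measurements = y_true + rng.laplace(scale=0.1, size=(n, m))

    # Averaging reduces the data error like 1/sqrt(n) ...
    y_bar = measurements.mean(axis=0)

    # ... and the componentwise sample variance estimates the remaining error,
    # since E||y_bar - y_true||^2 = (1/n) * sum_j Var(eps_j), without requiring
    # the true noise level:
    delta_est = np.sqrt(measurements.var(axis=0, ddof=1).sum() / n)

    print(delta_est, np.linalg.norm(y_bar - y_true))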

Highlights

  • The goal is to solve the ill-posed equation K x = y, where x ∈ X and y ∈ Y are elements of infinite-dimensional Hilbert spaces and K is either linear and bounded with non-closed range, or compact (see the sketch after this list)

  • Let us stress that using multiple measurements to decrease the data error is a standard engineering practice under the name 'signal averaging'; see, e.g., [27] for an introductory monograph or [20] for a survey article

  • [5] gives an explicit nontrivial example of a convergent regularisation, without knowledge of the exact error level, under Gaussian white noise. We extend this to arbitrary distributions here, provided multiple measurements are available
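As a small illustration of the ill-posedness named in the first highlight, the sketch below discretises a compact operator (integration on [0, 1]; the operator, grid size, noise level and regularisation parameter alpha are illustrative choices, not taken from the paper) and compares naive inversion of noisy data with a-priori Tikhonov regularisation:

    import numpy as np

    rng = np.random.default_rng(1)

    # Discretisation of a compact operator K: (K x)(t) ~ integral_0^t x(s) ds on [0, 1].
    m = 100
    t = np.linspace(0.0, 1.0, m)
    K = np.tril(np.ones((m, m))) / m
    x_true = np.sin(2 * np.pi * t)
    y_noisy = K @ x_true + 1e-2 * rng.standard_normal(m)

    # Naive inversion: the small singular values of K amplify the data error.
    x_naive = np.linalg.solve(K, y_noisy)

    # A-priori Tikhonov regularisation x_alpha = (K*K + alpha I)^(-1) K* y stays stable.
    alpha = 1e-3
    x_tik = np.linalg.solve(K.T @ K + alpha * np.eye(m), K.T @ y_noisy)

    print(np.linalg.norm(x_naive - x_true), np.linalg.norm(x_tik - x_true))

In this toy example the naive reconstruction is dominated by amplified noise, while the regularised solution stays close to x_true.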

Summary

Introduction

In [8], it was first shown how to obtain optimal convergence in L2 under Gaussian white noise with a modified version of the discrepancy principle. Another approach is to transfer results from the classical deterministic theory using the Ky Fan metric, which metrises convergence in probability. [5] gives an explicit nontrivial example of a convergent regularisation, without knowledge of the exact error level, under Gaussian white noise. We extend this to arbitrary distributions here, provided multiple measurements are available. After that, we briefly show how to choose the estimated noise level δ_n^est to obtain almost sure convergence, and we compare the methods numerically.
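The discrepancy principle driven by an estimated noise level can be sketched as follows (a minimal, purely illustrative implementation; the parameter tau = 1.5, the geometric grid alpha_k = alpha0 * q^k and the emergency stop after max_iter steps are assumptions, not the paper's exact method):

    import numpy as np

    def discrepancy_tikhonov(K, y_bar, delta_est, tau=1.5, alpha0=1.0, q=0.5, max_iter=60):
        # Decrease alpha along a geometric grid until the residual
        # ||K x_alpha - y_bar|| falls below tau * delta_est.
        m = K.shape[1]
        alpha = alpha0
        for _ in range(max_iter):
            x_alpha = np.linalg.solve(K.T @ K + alpha * np.eye(m), K.T @ y_bar)
            if np.linalg.norm(K @ x_alpha - y_bar) <= tau * delta_est:
                return x_alpha, alpha
            alpha *= q
        # 'Emergency stop': the residual never dropped below the threshold,
        # e.g. because delta_est underestimated the true data error.
        return x_alpha, alpha

    # Usage with the quantities from the previous sketches:
    # x_rec, alpha = discrepancy_tikhonov(K, y_bar, delta_est)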

  • A priori regularisation
  • The discrepancy principle
  • A counterexample for convergence
  • Convergence in probability of the discrepancy principle
  • Almost sure convergence
  • Proofs without emergency case
  • Proofs for the emergency stop case
  • Proof of Corollary 3
  • Differentiation of binary option prices
  • Inverse heat equation