Abstract

Deep learning based reconstruction methods deliver outstanding results for solving inverse problems and are therefore becoming increasingly important. A recently introduced class of learning-based reconstruction methods is NETT (network Tikhonov regularization), which uses a trained neural network as a regularizer in generalized Tikhonov regularization. The existing analysis of NETT considers fixed operators and fixed regularizers and analyzes the convergence as the noise level in the data approaches zero. In this paper, we considerably extend the framework and analysis to reflect various practical aspects and take into account discretization of the data space, the solution space, the forward operator, and the neural network defining the regularizer. We show the asymptotic convergence of the discretized NETT approach for decreasing noise levels and discretization errors. Additionally, we derive convergence rates and present numerical results for a limited data problem in photoacoustic tomography.
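
For orientation, a NETT-type reconstruction minimizes a generalized Tikhonov functional combining a data-fidelity term with a learned regularizer. The following is a minimal sketch in standard notation; the symbols D (data fidelity), Φ_θ (trained network), ψ (scalar functional) and α (regularization parameter) are chosen here for illustration and need not match the paper's exact definitions.

```latex
% Sketch of a NETT-type generalized Tikhonov functional (illustrative notation):
% D is a data-fidelity term, Phi_theta a trained network, psi a scalar functional,
% and alpha > 0 the regularization parameter.
\begin{equation*}
  x_\alpha^\delta \in \operatorname*{arg\,min}_{x \in X}
    \; \mathcal{D}\bigl(\mathbf{A} x,\, y^\delta\bigr)
    + \alpha\, \psi\bigl(\Phi_\theta(x)\bigr).
\end{equation*}
```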

Highlights

  • In this paper, we are interested in neural network based solutions to inverse problems of the form: find x from data yδ = Ax + η (1). Here A is a potentially non-linear operator between Banach spaces X and Y, yδ are the given noisy data, x is the unknown to be recovered, η is the unknown noise perturbation, and δ ≥ 0 indicates the noise level. A minimal numerical sketch of this data model is given after this list.

  • We expect the NETT functional to yield better results due to data consistency, which is mainly helpful outside the masked center diagonal.

  • We performed numerical experiments on a limited data problem in photoacoustic tomography (PAT), which is the combination of an inverse problem for the wave equation and an inpainting problem.
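
As a concrete illustration of the data model in the first highlight, the following Python/NumPy sketch simulates noisy data yδ = Ax + η for a small stand-in operator; the matrix A, the dimensions, and the noise level δ are arbitrary placeholders rather than the paper's PAT setup.

```python
import numpy as np

# Illustrative sketch of the data model y_delta = A x + eta from (1).
# The operator A, the sizes, and the noise level are placeholders, not the
# paper's actual PAT forward operator.
rng = np.random.default_rng(seed=0)

n, m = 64, 48                       # dimensions of solution and data space (arbitrary)
A = rng.standard_normal((m, n))     # stand-in linear forward operator
x_true = rng.standard_normal(n)     # unknown to be recovered

delta = 1e-2                        # noise level
eta = rng.standard_normal(m)
eta *= delta / np.linalg.norm(eta)  # scale the perturbation so that ||eta|| = delta

y_delta = A @ x_true + eta          # given noisy data
```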

Summary

Introduction

We are interested in neural network based solutions to inverse problems of the form (1), that is, finding x from data yδ = Ax + η. Here A is a potentially non-linear operator between Banach spaces X and Y, yδ are the given noisy data, x is the unknown to be recovered, η is the unknown noise perturbation, and δ ≥ 0 indicates the noise level. Special challenges in solving inverse problems are the non-uniqueness of solutions and their instability with respect to the given data. To overcome these issues, regularization methods are needed, which select specific solutions and at the same time stabilize the inversion process.
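
To make the role of such a regularization functional concrete, here is a minimal sketch of minimizing a generalized Tikhonov functional by gradient descent in Python/NumPy; the quadratic regularizer, the operators A and W, and all parameter values are placeholders for illustration and do not correspond to the trained network or the PAT operator used in the paper.

```python
import numpy as np

# Sketch of minimizing a generalized Tikhonov (NETT-type) functional
#     T(x) = ||A x - y_delta||^2 + alpha * ||W x||^2
# by plain gradient descent. The quadratic term ||W x||^2 is only a smooth
# stand-in for a trained-network regularizer; A, W, alpha, step and iters
# are illustrative choices, not the paper's setup.
def tikhonov_type_reconstruction(A, y_delta, W, alpha=1e-2, step=1e-4, iters=1000):
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        residual = A @ x - y_delta
        grad = 2.0 * (A.T @ residual) + 2.0 * alpha * (W.T @ (W @ x))
        x -= step * grad
    return x

# Example call with an identity "regularization operator":
# x_rec = tikhonov_type_reconstruction(A, y_delta, np.eye(A.shape[1]))
```

In a NETT-type method the quadratic stand-in would be replaced by a functional built from a trained network evaluated on x, and the minimization would typically use a problem-adapted step size or a more sophisticated optimizer.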

Reconstruction with Learned Regularizers
Discrete NETT
Outline
Well-Posedness
Convergence
Convergence Rates
Application to a Limited Data Problem in PAT
Discrete Forward Operator
Numerical Results
Conclusions