Abstract

We consider the ill-posed inverse problem of identifying a nonlinearity in a time-dependent partial differential equation model. The nonlinearity is approximated by a neural network (NN) and needs to be determined alongside other unknown physical parameters and the unknown state. Hence, it is not possible to construct input–output data pairs on which to perform supervised training. Proposing an all-at-once approach, we bypass the need for training data and recover all the unknowns simultaneously. In the general case, the approximation via a NN can be realized as a discretization scheme, and the training with noisy data can be viewed as an ill-posed inverse problem. Therefore, we study the discretization of regularization, in terms of Tikhonov and projected Landweber methods, for discretized inverse problems, and prove convergence as the discretization error (network approximation error) and the noise level tend to zero.
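As a rough illustration of the all-at-once idea sketched above (the notation here is not taken from the paper and is purely indicative), one may think of minimizing a Tikhonov functional jointly over the state $u$, the physical parameters $\theta$, and the weights $\phi$ of the NN approximation $\mathcal{N}_\phi$ of the unknown nonlinearity, with a model operator $E$, an observation operator $C$, noisy data $y^{\delta}$, and a regularization parameter $\alpha > 0$:

\[
  \min_{u,\,\theta,\,\phi}\;
  \|C u - y^{\delta}\|^{2}
  \;+\; \|E(u,\theta,\mathcal{N}_{\phi})\|^{2}
  \;+\; \alpha\bigl(\|u\|^{2} + \|\theta\|^{2} + \|\phi\|^{2}\bigr).
\]

In such a formulation the data misfit and the PDE residual are fitted simultaneously, so no input–output training pairs for the nonlinearity are required.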
