Abstract

We address discrete nonlinear inverse problems with weighted least squares and Tikhonov regularization. Regularization adds information to the problem when it is ill-posed or ill-conditioned, but it remains an open question how to weight this information. The discrepancy principle uses the residual norm to determine the regularization weight or parameter, while the $\chi^2$ method [J. Mead, J. Inverse Ill-Posed Probl., 16 (2008), pp. 175--194; J. Mead and R. A. Renaut, Inverse Problems, 25 (2009), 025002; J. Mead, Appl. Math. Comput., 219 (2013), pp. 5210--5223; R. A. Renaut, I. Hnetynkova, and J. L. Mead, Comput. Statist. Data Anal., 54 (2010), pp. 3430--3445] uses the regularized residual. The regularized residual has the benefit of providing a clear $\chi^2$ test with a fixed noise level when the number of parameters is equal to or greater than the number of data. Previous work on the $\chi^2$ method treated linear problems; here we extend it to nonlinear problems. In particular, we derive the appropriate $\chi^2$ tests for the Gauss--Newton and Levenberg--Marquardt algorithms, and we use these tests to find a regularization parameter or weights on errors in the initial parameter estimate. The resulting algorithm is applied to a two-dimensional cross-well tomography problem and a one-dimensional electromagnetic problem from [R. C. Aster, B. Borchers, and C. Thurber, Parameter Estimation and Inverse Problems, Academic Press, New York, 2005].
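For orientation, a minimal sketch of the setup follows; the notation (forward map $F$, data $d$, initial parameter estimate $x_0$, data and prior covariance matrices $C_d$ and $C_x$) is assumed here for illustration and need not match the paper's symbols. The weighted, Tikhonov-regularized least squares functional takes the form
\[
  J(x) \;=\; \bigl(F(x)-d\bigr)^{\mathsf{T}} C_d^{-1} \bigl(F(x)-d\bigr)
  \;+\; (x-x_0)^{\mathsf{T}} C_x^{-1} (x-x_0),
\]
and, in the linear Gaussian setting, the $\chi^2$ method rests on the fact that the minimum value $J(\hat{x})$ is approximately $\chi^2$ distributed with a known number of degrees of freedom (the number of data $m$ in the standard case). Testing $J(\hat{x})$ against this distribution then selects $C_x$ or a scalar regularization parameter; the present work derives the analogous tests along Gauss--Newton and Levenberg--Marquardt iterations in the nonlinear case.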
