Abstract
We develop a computationally efficient algorithm for the automatic regularization of nonlinear inverse problems based on the discrepancy principle. We formulate the problem as an equality constrained optimization problem, where the constraint is given by a least squares data fidelity term and expresses the discrepancy principle. The objective function is a convex regularization function that incorporates some prior knowledge, such as the total variation regularization function. Using the Jacobian matrix of the nonlinear forward model, we consider a sequence of quadratically constrained optimization problems that can all be solved using the Projected Newton method. We show that the solution of such a quadratically constrained sub-problem results in a descent direction for an exact merit function. This merit function can then be used to describe a formal line-search method. We also formulate a slightly more heuristic approach that simplifies the algorithm and allows for an inexact solution of the sequence of sub-problems. We illustrate the behavior of the algorithm using a number of numerical experiments, with Talbot-Lau x-ray phase contrast imaging as the main application. The numerical experiments confirm that the quadratically constrained sub-problems need not be solved with high accuracy in early iterations to make sufficient progress towards the solution. In addition, we show that the proposed method is able to produce reconstructions of similar quality compared to other state-of-the-art approaches with a significant reduction in computational time.
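To give a concrete picture of the outer iteration described above, the following is a minimal, hypothetical Python sketch. The toy forward model, regularizer, noise level sigma2 and all function names are illustrative assumptions, and SciPy's generic SLSQP solver stands in for the paper's Projected Newton inner solver; the exact merit function and line search are only indicated by a comment.

```python
# Hypothetical sketch of the outer "sequential" loop described in the abstract.
# All names and models are illustrative; the quadratically constrained
# sub-problem is handed to SciPy's SLSQP instead of a Projected Newton solver.
import numpy as np
from scipy.optimize import minimize

def forward(x):
    """Toy nonlinear forward model s(x) (placeholder for, e.g., a Talbot-Lau model)."""
    return np.array([x[0] ** 2 + x[1], np.sin(x[0]) + x[1] ** 2])

def jacobian(x):
    """Jacobian J(x) with entries d s_i / d x_j of the toy model above."""
    return np.array([[2.0 * x[0], 1.0],
                     [np.cos(x[0]), 2.0 * x[1]]])

def regularizer(x):
    """Convex prior term; a simple Tikhonov term stands in for total variation here."""
    return 0.5 * np.dot(x, x)

def sequential_step(x, b, sigma2):
    """Solve the linearized, quadratically constrained sub-problem for a step dx."""
    r = forward(x) - b
    J = jacobian(x)
    # discrepancy-principle constraint on the linearized residual:
    #   0.5 * ||J dx + r||^2 = sigma2
    cons = {"type": "eq",
            "fun": lambda dx: 0.5 * np.sum((J @ dx + r) ** 2) - sigma2}
    res = minimize(lambda dx: regularizer(x + dx), np.zeros_like(x),
                   constraints=[cons], method="SLSQP")
    return res.x

def sequential_projected_newton(b, sigma2, x0, n_outer=20, tol=1e-8):
    """Outer loop: linearize, solve the sub-problem, take a step."""
    x = x0.copy()
    for _ in range(n_outer):
        dx = sequential_step(x, b, sigma2)
        x = x + dx  # the paper's merit-function line search would damp this step
        misfit = 0.5 * np.sum((forward(x) - b) ** 2)
        if abs(misfit - sigma2) < tol:
            break
    return x

if __name__ == "__main__":
    x_true = np.array([0.8, -0.3])
    b = forward(x_true)
    sigma2 = 1e-4  # assumed noise level (discrepancy target)
    x_rec = sequential_projected_newton(b, sigma2, x0=np.array([0.5, 0.5]))
    print("reconstruction:", x_rec)
```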
Highlights
In this manuscript we consider the regularized nonlinear least squares problem
\[
\min_{x}\; \Psi(x) \quad \text{subject to} \quad \tfrac{1}{2}\,\| s(x) - b \|_2^2 = \sigma^2, \tag{1}
\]
where the Jacobian of the forward model $s$ is denoted
\[
J(x) = \left( \frac{\partial s_i(x)}{\partial x_j} \right)_{i,j}
\]
(see the linearized sub-problem sketched after these highlights).
We show that the solution of such a quadratically constrained sub-problem results in a descent direction for an exact merit function.
To study the performance of the Sequential Projected Newton method for different choices of ζ, we report timings in Table 1.
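As a reading aid for the formulation in the first highlight, the linearization mentioned in the abstract would, in our understanding, replace s(x) by its first-order expansion around the current iterate; the iterate notation $x_k$, $\Delta x$ below is ours and is only a sketch of such a sub-problem:
\[
\min_{\Delta x}\; \Psi(x_k + \Delta x)
\quad \text{subject to} \quad
\tfrac{1}{2}\,\bigl\| J(x_k)\,\Delta x + s(x_k) - b \bigr\|_2^2 = \sigma^2 ,
\]
whose solution then serves as the descent direction for the exact merit function referred to in the second highlight.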
Summary
Sequential Projected Newton method for regularization of nonlinear least squares problems. We develop a computationally efficient algorithm for the automatic regularization of nonlinear inverse problems based on the discrepancy principle. Using the Jacobian of the forward model, the problem is reduced to a sequence of quadratically constrained optimization problems that can all be solved using the Projected Newton method. We show that the solution of such a quadratically constrained sub-problem results in a descent direction for an exact merit function, which can then be used to describe a formal line-search method. We also formulate a slightly more heuristic approach that simplifies the algorithm and allows for an inexact solution of the sequence of sub-problems. The numerical experiments confirm that the sub-problems need not be solved with high accuracy in early iterations to make sufficient progress towards the solution, and that the proposed method produces reconstructions of similar quality compared to other state-of-the-art approaches with a significant reduction in computational time.