Abstract

We extend the globally convergent TIGRA method of Ramlau (2003 Inverse Prob. 19 433–65) to compute a minimizer of a Tikhonov-type functional with convex penalty terms Θ for nonlinear forward operators in Banach spaces. The penalties Θ are allowed to be non-smooth, so as to include sparsity-promoting or TV (total variation) functionals, which are important for reconstructing special features of solutions such as sparsity and discontinuities. The proposed TIGRA-Θ method uses a dual gradient descent method in the inner iteration and linearly decreases the regularization parameter in the outer iteration. We present a global convergence analysis of the algorithm under suitable parameter selections, and convergence rate results are provided under both a priori and a posteriori stopping rules. Two numerical examples, an auto-convolution problem and a parameter identification problem, are presented to illustrate the theoretical analysis and verify the effectiveness of the method.
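The two-level structure described above (an inner iteration that approximately minimizes the Tikhonov functional for a fixed regularization parameter, and an outer iteration that decreases that parameter while warm-starting from the previous stage) can be sketched on a toy problem. This is only an illustrative sketch, not the paper's method: the forward operator `F`, the step sizes, and all parameter values below are invented, the Banach-space setting is replaced by a finite-dimensional one, and the non-smooth penalty Θ with its dual gradient step is simplified to a smooth ℓ² penalty with plain gradient descent.

```python
import numpy as np

def F(x):
    """Toy nonlinear forward operator (elementwise quadratic); a stand-in only."""
    return x**2 + 0.5 * x

def F_deriv(x):
    """Diagonal of the Jacobian of the toy operator F."""
    return 2.0 * x + 0.5

def tigra_sketch(y, x0, alpha0=1.0, q=0.5, n_outer=20, n_inner=200, step=0.05):
    """TIGRA-style two-level iteration (illustrative simplification).

    Inner loop: gradient descent on the Tikhonov functional
        0.5 * ||F(x) - y||^2 + 0.5 * alpha * ||x||^2.
    Outer loop: decrease alpha by a fixed factor q, warm-starting each
    stage at the minimizer of the previous, better-regularized stage.
    """
    x = x0.astype(float).copy()
    alpha = alpha0
    for _ in range(n_outer):
        for _ in range(n_inner):
            grad = F_deriv(x) * (F(x) - y) + alpha * x
            x -= step * grad
        alpha *= q  # reduce the regularization parameter for the next stage
    return x

if __name__ == "__main__":
    x_true = np.array([0.8, 0.3, 0.5])
    y = F(x_true)                              # exact data for the toy problem
    x_rec = tigra_sketch(y, x0=np.full(3, 0.5))
    print(np.round(x_rec, 4))
```

The warm start is the essential ingredient: each stage begins inside the basin of attraction of the next, less-regularized minimizer, which is what yields global convergence of TIGRA-type schemes despite the nonconvexity of the data-fit term.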
