Abstract. Nesterov’s acceleration strategy is renowned for speeding up the convergence of gradient-based optimization algorithms and has been crucial in developing fast first-order methods for well-posed convex optimization problems. Although Nesterov’s accelerated gradient method has been adapted as an iterative regularization method for solving ill-posed inverse problems, no general convergence theory is available except in some special cases. In this paper, we develop an adaptive Nesterov momentum method for solving ill-posed inverse problems in Banach spaces, where the step-sizes and momentum coefficients are chosen adaptively via explicit formulas. Additionally, uniformly convex regularization functions are incorporated to detect features of the sought solutions. Under standard conditions, we establish the regularization property of our method when it is terminated by the discrepancy principle. Various numerical experiments demonstrate that our method outperforms the Landweber-type method in both the required number of iterations and the computational time.
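To fix ideas, the sketch below shows a Nesterov-type accelerated Landweber iteration with discrepancy-principle stopping in the simplified Hilbert-space, linear setting A x = y^δ. The step-size rule (a steepest-descent step) and the momentum schedule λ_k = k/(k+α) are illustrative stand-ins, not the adaptive formulas derived in the paper, which moreover treats Banach spaces with uniformly convex regularization functions; the names `tau`, `alpha`, and `nesterov_landweber` are hypothetical.

```python
import numpy as np

def nesterov_landweber(A, y_delta, delta, tau=1.1, alpha=3.0, max_iter=5000):
    """Nesterov-accelerated Landweber-type iteration for A x = y_delta,
    stopped by the discrepancy principle ||A x_k - y_delta|| <= tau * delta.

    Illustrative sketch only: Hilbert-space, linear, unregularized setting.
    """
    m, n = A.shape
    x_prev = np.zeros(n)
    x = np.zeros(n)
    for k in range(max_iter):
        # Discrepancy principle: stop once the residual drops to the noise level.
        if np.linalg.norm(A @ x - y_delta) <= tau * delta:
            return x, k
        # Momentum extrapolation with a classical Nesterov-type coefficient.
        lam = k / (k + alpha)
        z = x + lam * (x - x_prev)
        # Gradient of 0.5 * ||A z - y_delta||^2 at the extrapolated point.
        g = A.T @ (A @ z - y_delta)
        # Adaptive (steepest-descent) step size along -g; a stand-in for the
        # paper's explicit step-size formula, which is not reproduced here.
        Ag = A @ g
        mu = (g @ g) / max(Ag @ Ag, 1e-30)
        x_prev, x = x, z - mu * g
    return x, max_iter
```

Here `tau` > 1 is the usual safety factor in the discrepancy principle, and early stopping at the noise level δ is what makes the iteration act as a regularization method rather than converging to a noise-corrupted least-squares solution.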