Abstract

In this paper, we consider a class of nonconvex nonsmooth optimization problems whose objective function is the sum of a smooth function with a Lipschitz continuous gradient and a convex nonsmooth function. We first propose a proximal stochastic recursive momentum algorithm (ProxSTORM) with mini-batch for solving these problems and analyse its convergence behaviour. Then, based on the Polyak–Łojasiewicz inequality, we establish the global linear convergence rate of ProxSTORM. Finally, numerical experiments are conducted to illustrate the efficiency of our method.
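To make the composite setting concrete, the following is a minimal sketch of a proximal step combined with a STORM-type recursive-momentum gradient estimator, as described in the existing literature. The step size `eta`, momentum weight `a`, batch size, and the toy least-squares-plus-l1 problem are illustrative assumptions and do not reproduce the paper's parameter schedules or theoretical conditions.

```python
# Hedged sketch: mini-batch proximal step with a STORM-type recursive momentum
# gradient estimator. Problem: (1/2n)||Ax - b||^2 + lam*||x||_1 (illustrative only;
# step sizes and momentum schedule are assumptions, not the paper's settings).
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (the convex nonsmooth part)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def prox_storm(A, b, lam=0.1, eta=0.01, a=0.1, batch=32, iters=500, seed=0):
    rng = np.random.default_rng(seed)
    n, p = A.shape
    x = np.zeros(p)

    def grad(z, idx):
        Ai = A[idx]
        return Ai.T @ (Ai @ z - b[idx]) / len(idx)

    # Initialize the gradient estimator with one mini-batch gradient.
    d = grad(x, rng.choice(n, size=batch, replace=False))
    for _ in range(iters):
        # Proximal step on the current estimator.
        x_next = soft_threshold(x - eta * d, eta * lam)
        # Recursive momentum update: variance-reduced estimator using a fresh batch.
        idx = rng.choice(n, size=batch, replace=False)
        d = grad(x_next, idx) + (1.0 - a) * (d - grad(x, idx))
        x = x_next
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((1000, 50))
    x_true = np.zeros(50); x_true[:5] = 1.0
    b = A @ x_true + 0.01 * rng.standard_normal(1000)
    x_hat = prox_storm(A, b)
    print("estimated support:", np.flatnonzero(np.abs(x_hat) > 0.1))
```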
