Abstract

In a Hilbert space $\mathcal H$, we analyze the convergence properties of the inertial forward-backward algorithm
$$(\mathrm{IFB}) \qquad \begin{cases} y_k = x_k + \alpha_k (x_k - x_{k-1}), \\ x_{k+1} = \operatorname{prox}_{s\Psi}\bigl(y_k - s\nabla \Phi(y_k)\bigr), \end{cases}$$
where $(\alpha_k)$ is a general sequence of nonnegative numbers, $\Psi: \mathcal H \to \mathbb R \cup \lbrace + \infty \rbrace$ is a proper lower-semicontinuous convex function, and $\Phi: \mathcal H \to \mathbb R$ is a differentiable convex function whose gradient is Lipschitz continuous. Various options for the sequence $(\alpha_k)$ have been considered in the literature. Among them, the Nesterov choice leads to the FISTA algorithm and accelerates the convergence of the values from $\mathcal{O}(1/k)$ to $\mathcal{O}(1/k^2)$. Several variants are used to guarantee the convergence of the iterates or to improve the rate of convergence of the values. For the design of fast optimization methods, the tuning of the sequence $(\alpha_k)$ is a subtle issue, which we address in this paper in a general setting. We show that the convergence rate of the algorithm can be obtained simply by analyzing the sequence $(\alpha_k)$. In addition to the case $\alpha_k = 1 - \frac{\alpha}{k}$ with $\alpha \geq 3$, our results apply equally well to $\alpha_k = 1 - \frac{\alpha}{k^r}$ with an exponent $0<r<1$, and to Polyak's heavy ball method. Thus, we unify most of the existing results based on Nesterov's accelerated gradient method. In the process, we improve some of them and discover new ones.
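As a concrete illustration (not taken from the paper itself), the following is a minimal Python sketch of the $(\mathrm{IFB})$ iteration above, instantiated on a lasso problem where $\Phi(x)=\frac12\|Ax-b\|^2$ and $\Psi=\lambda\|\cdot\|_1$, so that $\operatorname{prox}_{s\Psi}$ is componentwise soft-thresholding. The problem data, the step size $s = 1/L$, and the choice $\alpha_k = \max(0,\, 1-\alpha/k)$ are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ifb(A, b, lam, alpha=3.0, n_iter=500):
    """Inertial forward-backward iteration, sketched for the lasso problem
    min_x 0.5*||Ax - b||^2 + lam*||x||_1, with alpha_k = max(0, 1 - alpha/k).
    These choices are illustrative, not prescribed by the paper."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of grad Phi
    s = 1.0 / L                          # step size s <= 1/L
    x_prev = x = np.zeros(A.shape[1])
    for k in range(1, n_iter + 1):
        a_k = max(0.0, 1.0 - alpha / k)  # nonnegative extrapolation coefficient
        y = x + a_k * (x - x_prev)       # inertial (extrapolation) step
        grad = A.T @ (A @ y - b)         # forward step: grad Phi(y)
        x_prev, x = x, soft_threshold(y - s * grad, s * lam)  # backward (prox) step
    return x

# Small synthetic example: recover a sparse vector from random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100); x_true[:5] = 1.0
b = A @ x_true
print(ifb(A, b, lam=0.1)[:8])
```

With $\alpha = 3$ this coefficient mirrors the Nesterov-type choice $\alpha_k = 1 - \alpha/k$ discussed in the abstract; replacing it with $\alpha_k = 1 - \alpha/k^r$ for $0<r<1$, or with a constant $\alpha_k \equiv \bar\alpha \in [0,1)$ as in Polyak's heavy ball method, requires changing only the single line defining `a_k`.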
