Abstract

In a Hilbert space setting, we develop fast convex optimization methods based on an evolution system that is third-order in time. The function f to minimize is convex and continuously differentiable, with a nonempty set of minimizers, and it enters the dynamic via its gradient. On the basis of Lyapunov analysis and temporal scaling techniques, we establish a fast convergence rate for the function values and obtain convergence of the trajectories towards optimal solutions. When f is strongly convex, an exponential rate of convergence is obtained. We complete the study of the continuous dynamic by introducing a damping term driven by the Hessian of f, which allows the oscillations to be controlled and attenuated. We then analyse the convergence of the proximal-based algorithms obtained by temporal discretization of this system and obtain similar convergence rates. The algorithmic results are valid for a general convex, lower semicontinuous, and proper function f.
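The abstract does not reproduce the third-order system or its discretization, so the sketch below is purely generic: it illustrates how a proximal-based algorithm arises from an implicit time discretization of a continuous dynamic, using the classical first-order gradient flow rather than the paper's third-order system, and a quadratic test function chosen only because its proximal operator has a closed form.

```python
# Illustrative sketch (not the paper's scheme): an implicit (backward-Euler)
# discretization of the gradient flow  x'(t) = -grad f(x(t))  yields the
# proximal point iteration  x_{k+1} = prox_{h f}(x_k).
# Here f(x) = 0.5 * ||A x - b||^2, whose prox can be computed exactly.

import numpy as np

def prox_quadratic(x, A, b, h):
    """prox_{h f}(x) for f(u) = 0.5*||A u - b||^2, i.e. the minimizer of
    f(u) + (1/(2h)) * ||u - x||^2, obtained from its linear optimality system."""
    n = A.shape[1]
    M = h * (A.T @ A) + np.eye(n)
    return np.linalg.solve(M, x + h * (A.T @ b))

def proximal_point(x0, A, b, h=1.0, iters=200):
    """Generic proximal point iteration: one implicit time step per iteration."""
    x = x0.copy()
    for _ in range(iters):
        x = prox_quadratic(x, A, b, h)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((30, 10))
    b = rng.standard_normal(30)
    x_star = np.linalg.lstsq(A, b, rcond=None)[0]  # exact minimizer, for reference
    x_hat = proximal_point(np.zeros(10), A, b)
    print("distance to minimizer:", np.linalg.norm(x_hat - x_star))
```

Discretizing a higher-order inertial system in the same implicit fashion would produce the kind of proximal-based algorithms the abstract refers to; the precise update rule depends on the system's damping and temporal-scaling coefficients, which are given in the full paper.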
