Abstract

In a Hilbert setting, we develop fast methods for convex unconstrained optimization. We rely on the asymptotic behavior of an inertial system combining geometric damping with temporal scaling. The convex function to minimize enters the dynamic via its gradient. The dynamic involves three time-dependent coefficients: a viscous damping coefficient, a coefficient attached to the Hessian-driven damping, and a time scaling coefficient. We study the convergence rate of the function values under general conditions involving the damping and the time scale coefficients. The obtained results rest on a new Lyapunov analysis and encompass known results on the subject. We pay particular attention to the case of an asymptotically vanishing viscous damping, which is directly related to the accelerated gradient method of Nesterov. The Hessian-driven damping significantly reduces the oscillations typical of inertial methods. We obtain an exponential rate of convergence of the values without assuming strong convexity of the objective function. The temporal discretization of these dynamics opens the gate to a large class of inertial optimization algorithms.
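
The abstract does not display the dynamic itself. Based on its description, a plausible generic form, writing $\gamma(t)$ for the viscous damping, $\beta(t)$ for the Hessian-driven damping, and $b(t)$ for the time scaling (all placeholder notation, not taken from the paper), is

$$\ddot{x}(t) + \gamma(t)\,\dot{x}(t) + \beta(t)\,\nabla^2 f(x(t))\,\dot{x}(t) + b(t)\,\nabla f(x(t)) = 0,$$

where $f$ is the convex objective; the vanishing-damping case mentioned above corresponds to $\gamma(t) = \alpha/t$, the regime of Nesterov's accelerated gradient method.

To illustrate the closing remark about temporal discretization, here is a minimal Python sketch (our own construction, not the authors' algorithm) of one explicit discretization of such a dynamic. It uses the identity $\frac{d}{dt}\nabla f(x(t)) = \nabla^2 f(x(t))\,\dot{x}(t)$ to replace the Hessian term by a difference of consecutive gradients, so no second derivatives are ever formed; the coefficient choices ($\gamma(t) = \alpha/t$ with $\alpha = 3$, constant $\beta$ and $b$) are assumptions made for the demo.

```python
import numpy as np

def inertial_hessian_damped(grad, x0, steps=2000, h=0.01, alpha=3.0,
                            beta=lambda t: 1.0,   # assumed Hessian-damping coefficient
                            b=lambda t: 1.0):     # assumed time-scaling coefficient
    """Sketch of an explicit discretization of the inertial dynamic.

    The term beta(t) * Hess f(x) * x'(t) is approximated by
    beta(t) * (grad(x_k) - grad(x_{k-1})) / h, avoiding Hessians.
    With gamma(t) = alpha/t and t_k = h*(k+2), the momentum factor
    1 - h*gamma(t_k) = (k - 1)/(k + 2) mirrors Nesterov's scheme.
    """
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev.copy()
    g_prev = grad(x_prev)
    for k in range(1, steps + 1):
        t = h * (k + 2)
        g = grad(x)
        x_next = (x + (1.0 - h * (alpha / t)) * (x - x_prev)  # inertia + viscous damping
                  - h * beta(t) * (g - g_prev)                # Hessian-driven damping
                  - h * h * b(t) * g)                         # time-scaled gradient step
        x_prev, x, g_prev = x, x_next, g
    return x

# Demo on an ill-conditioned quadratic f(x) = 0.5 * x^T A x (minimum value 0).
A = np.diag([1.0, 100.0])
x_end = inertial_hessian_damped(lambda x: A @ x, x0=[1.0, 1.0])
print(0.5 * x_end @ (A @ x_end))  # objective value decays toward 0
```

On such anisotropic problems, the gradient-difference term damps the transverse oscillations that plain heavy-ball-type schemes exhibit, consistent with the abstract's remark that the Hessian-driven damping reduces the oscillations.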

