Abstract

In this paper, we propose a new family of continuous-time optimisation algorithms based on discontinuous second-order gradient optimisation flows, with finite-time convergence guarantees to local optima for locally strongly convex (time-varying) cost functions. To analyse our flows, we first extend a well-known Lyapunov inequality condition for finite-time stability to the case of (time-varying) differential inclusions. We then prove the finite-time convergence of these second-order flows. In some particular cases, we show that the convergence time can be pre-specified by the user. We propose a robustification of the flows against bounded additive uncertainties and extend some of the results to the case of constrained optimisation. We demonstrate the performance of these flows on well-known optimisation benchmarks, namely the Rosenbrock function and the Rastrigin function.
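To give a flavour of how a discontinuous flow can reach a minimiser in finite time, the sketch below simulates a classic first-order analogue of such flows: the normalised gradient flow x' = -∇f(x)/‖∇f(x)‖, whose right-hand side is discontinuous at the optimum and which reaches the minimiser of a strongly convex function in finite time (the descent speed is constant along trajectories). This is an illustrative stand-in, not the paper's second-order flow; the function names, the test function f(x) = ½‖x‖², and all parameters are our own choices for this example.

```python
import numpy as np

def grad_f(x):
    # Gradient of the strongly convex test function f(x) = 0.5 * ||x||^2
    return x

def normalized_gradient_flow(grad, x0, dt=1e-3, max_steps=5000):
    """Forward-Euler simulation of the discontinuous flow
    x' = -grad(x) / ||grad(x)||.

    Because the flow moves at unit speed towards the minimiser of a
    strongly convex f, the continuous-time trajectory arrives in finite
    time T ~ distance(x0, x*); the discretisation stops once the
    gradient norm drops to the step-size scale.
    """
    x = np.asarray(x0, dtype=float)
    steps = 0
    for steps in range(max_steps):
        g = grad(x)
        n = np.linalg.norm(g)
        if n <= dt:  # within one Euler step of the optimum
            break
        x = x - dt * g / n
    return x, steps

# Starting at distance 1 from the optimum with dt = 1e-3, the
# trajectory needs on the order of 1/dt = 1000 Euler steps.
x_final, steps = normalized_gradient_flow(grad_f, [0.8, 0.6])
```

Note the contrast with the plain gradient flow x' = -∇f(x), which only converges exponentially (never exactly in finite time): the normalisation keeps the speed bounded away from zero near the optimum, which is exactly what makes the right-hand side discontinuous there.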

