Abstract

In this paper, we study a discrete momentum consensus-based optimization (Momentum-CBO) algorithm that corresponds to a second-order generalization of the discrete first-order CBO [S.-Y. Ha, S. Jin and D. Kim, Convergence of a first-order consensus-based global optimization algorithm, Math. Models Methods Appl. Sci. 30 (2020) 2417–2444]. The proposed algorithm can be understood as a modification of ADAM-CBO in which the normalization term is replaced by unity. For the proposed Momentum-CBO, we provide a sufficient framework that guarantees convergence of the algorithm toward a global minimum of the objective function. Moreover, we present several experimental results showing that Momentum-CBO has a higher success rate in finding the global minimum than vanilla CBO, and we demonstrate the stability of Momentum-CBO under different initialization schemes. We also show that Momentum-CBO can serve as an alternative to ADAM-CBO, which lacks a rigorous convergence analysis. Finally, we present an application of Momentum-CBO to Lyapunov function approximation using symbolic regression techniques.
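To make the algorithmic idea concrete, the following is a minimal Python sketch of a momentum-style CBO iteration. It assumes the standard Gibbs-weighted consensus point from the CBO literature and a heavy-ball momentum term in place of ADAM's normalized update; the function names (`consensus_point`, `momentum_cbo`) and all hyperparameter values (`lam`, `sigma`, `gamma`, `m`, `beta`) are illustrative choices, not the exact discretization or parameters of the paper.

```python
import numpy as np

def consensus_point(X, f, beta=30.0):
    """Gibbs-weighted average of particles: weights exp(-beta * f(x))."""
    fx = np.apply_along_axis(f, 1, X)
    w = np.exp(-beta * (fx - fx.min()))       # shift by min for numerical stability
    return (w[:, None] * X).sum(axis=0) / w.sum()

def momentum_cbo(f, dim, n_particles=100, n_iter=1000,
                 lam=1.0, sigma=1.0, gamma=0.1, m=0.8, beta=30.0, seed=0):
    """Illustrative Momentum-CBO sketch: heavy-ball drift toward the
    consensus point plus drift-scaled noise, with the ADAM-style
    normalization term replaced by unity (i.e., plain momentum)."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-3.0, 3.0, size=(n_particles, dim))   # particle positions
    V = np.zeros_like(X)                                  # particle velocities
    for _ in range(n_iter):
        xbar = consensus_point(X, f, beta)
        drift = X - xbar
        # componentwise (anisotropic) noise, scaled by distance to consensus
        noise = sigma * np.sqrt(gamma) * drift * rng.standard_normal(X.shape)
        V = m * V - lam * gamma * drift        # momentum step, no normalization
        X = X + V + noise
    return consensus_point(X, f, beta)

# usage: approximate the global minimizer (the origin) of the 2D Rastrigin function
rastrigin = lambda x: 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))
print(momentum_cbo(rastrigin, dim=2))
```

In this sketch, setting `m = 0` recovers a first-order CBO-type update, while ADAM-CBO would additionally divide the momentum step by a running second-moment estimate; replacing that denominator by unity is the modification described above.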
