Abstract

Population-based global optimization methods can be extended with properly defined networks in order to explore the structure of the search space, to describe how the method performed on a given problem, and to inform the optimization algorithm so that it becomes more efficient. We investigate the memetic differential evolution (MDE) algorithm combined with local optima networks (LONs) from these three perspectives. First, we report the performance of the classical differential evolution variants applied within MDE, together with the structural properties of the resulting LONs. Second, we propose a new restarting rule that aims to avoid early convergence and uses the LON built up during the evolutionary search of MDE. Finally, we present promising results for this new rule, contributing to the effort of combining optimization methods with network science.
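
To make the idea of an LON-informed restart concrete, the following is a minimal Python sketch, not the authors' implementation: a memetic DE loop that refines each trial vector with local search, records the visited local optima as nodes of a LON, and restarts the population when the search keeps revisiting known LON nodes. The objective function, the local search choice (L-BFGS-B), and the revisit-count trigger are illustrative assumptions.

# Minimal sketch (assumed details, not the paper's algorithm): memetic DE
# with a local optima network (LON) used to trigger a population restart.
import numpy as np
import networkx as nx
from scipy.optimize import minimize


def sphere(x):
    """Toy objective; replace with the benchmark of interest."""
    return float(np.sum(x ** 2))


def lon_key(x, digits=3):
    """Map a local optimum to a hashable LON node by rounding coordinates."""
    return tuple(np.round(x, digits))


def memetic_de_with_lon_restart(f, dim=5, pop_size=20, bounds=(-5.0, 5.0),
                                generations=200, F=0.7, CR=0.9,
                                revisit_limit=30, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    lon = nx.DiGraph()  # nodes: local optima, edges: accepted moves between them

    def refine(x):
        """Local search step of the memetic algorithm."""
        res = minimize(f, x, method="L-BFGS-B", bounds=[(lo, hi)] * dim)
        return res.x, res.fun

    def new_population():
        raw = rng.uniform(lo, hi, size=(pop_size, dim))
        return np.array([refine(ind)[0] for ind in raw])

    pop = new_population()
    fit = np.array([f(ind) for ind in pop])
    best_x, best_f = pop[fit.argmin()].copy(), fit.min()
    revisits = 0  # consecutive accepted moves landing on already-known optima

    for _ in range(generations):
        for i in range(pop_size):
            # Classical DE/rand/1/bin trial vector.
            a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])

            # Memetic step: push the trial vector to its local optimum.
            trial, trial_f = refine(trial)
            if trial_f <= fit[i]:
                src, dst = lon_key(pop[i]), lon_key(trial)
                revisits = revisits + 1 if lon.has_node(dst) else 0
                lon.add_edge(src, dst)
                pop[i], fit[i] = trial, trial_f
                if trial_f < best_f:
                    best_x, best_f = trial.copy(), trial_f

        # Assumed restart rule: many consecutive returns to known LON nodes
        # indicate early convergence, so reinitialise the population.
        if revisits >= revisit_limit:
            pop = new_population()
            fit = np.array([f(ind) for ind in pop])
            revisits = 0

    return best_x, best_f, lon


if __name__ == "__main__":
    x, fx, lon = memetic_de_with_lon_restart(sphere)
    print("best value:", fx, "| LON nodes:", lon.number_of_nodes())

The returned LON can also be inspected after the run (node count, in-degrees of the best optima) to characterise how the search moved between basins, which is the descriptive use of LONs mentioned above.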
