Abstract

Population-based global optimization methods can be extended with properly defined networks in order to explore the structure of the search space, to describe how a method performed on a given problem, and to inform the optimization algorithm so that it becomes more efficient. The memetic differential evolution (MDE) algorithm combined with local optima networks (LONs) is investigated from these perspectives. First, we report the performance of the classical differential evolution variants applied within MDE, together with the structural properties of the resulting LONs. Second, a new restarting rule is proposed that aims to avoid premature convergence by exploiting the LON built up during the evolutionary search of MDE. Finally, we show the promising results of this new rule, which contributes to the efforts of combining optimization methods with network science.
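To make the idea of a LON-informed restart concrete, the sketch below shows one possible way such a mechanism could be wired into a memetic DE loop. The abstract does not specify the actual restart criterion, local search, or LON construction used in the paper, so the toy objective, the coordinate-rounding node key, and the "revisit-ratio" trigger here are illustrative assumptions only, not the authors' method.

```python
# Minimal illustrative sketch, not the paper's algorithm: the objective,
# local search, LON node definition, and restart trigger are assumptions.
import numpy as np
import networkx as nx


def sphere(x):
    """Toy objective function (assumption; the paper's benchmarks are not shown here)."""
    return float(np.sum(x ** 2))


def local_search(x, f, step=0.1, iters=50):
    """Crude hill climber standing in for the memetic local optimization step."""
    best, best_f = x.copy(), f(x)
    for _ in range(iters):
        cand = best + np.random.normal(0.0, step, size=best.shape)
        cf = f(cand)
        if cf < best_f:
            best, best_f = cand, cf
    return best, best_f


def node_key(x, decimals=1):
    """Map a local optimum to a LON node by coordinate rounding (an assumption)."""
    return tuple(np.round(x, decimals))


def mde_with_lon_restart(f, dim=5, pop_size=20, gens=200,
                         F=0.5, CR=0.9, revisit_ratio=0.5, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5, 5, size=(pop_size, dim))
    lon = nx.DiGraph()            # LON built up during the evolutionary search
    prev_node = None
    for _ in range(gens):
        revisits = 0
        for i in range(pop_size):
            # Classical DE/rand/1/bin mutation and crossover
            a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
            mutant = a + F * (b - c)
            cross = rng.random(dim) < CR
            trial = np.where(cross, mutant, pop[i])
            # Memetic step: refine the trial vector and record its optimum in the LON
            opt, opt_f = local_search(trial, f)
            node = node_key(opt)
            if lon.has_node(node):
                revisits += 1
            lon.add_node(node, fitness=opt_f)
            if prev_node is not None and prev_node != node:
                lon.add_edge(prev_node, node)
            prev_node = node
            if opt_f < f(pop[i]):
                pop[i] = opt
        # Assumed restart trigger: too many offspring landing in already-known LON nodes
        if revisits / pop_size > revisit_ratio:
            pop = rng.uniform(-5, 5, size=(pop_size, dim))
    best = min(pop, key=f)
    return best, f(best), lon


if __name__ == "__main__":
    x_best, f_best, lon = mde_with_lon_restart(sphere)
    print(f"best fitness: {f_best:.4f}, LON nodes: {lon.number_of_nodes()}")
```

The design intent mirrors the abstract: the LON doubles as a record of where the search has already converged, and a restart fires when the population keeps rediscovering known local optima, which is one plausible proxy for early convergence.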
