Abstract

In this study, a new parameter control scheme is proposed for the differential evolution algorithm. The developed linear bias reduction scheme controls the Lehmer mean parameter value depending on the optimization stage, allowing the algorithm to improve the exploration properties at the beginning of the search and speed up the exploitation at the end of the search. As a basic algorithm, the L-SHADE approach is considered, as well as its modifications, namely the jSO and DISH algorithms. The experiments are performed on the CEC 2017 and 2020 bound-constrained benchmark problems, and the performed statistical comparison of the results demonstrates that the linear bias reduction allows significant improvement of the differential evolution performance for various types of optimization problems.
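The core idea described in the abstract can be illustrated with a short sketch. The weighted Lehmer mean is the standard aggregation used in SHADE-family parameter adaptation, and the linear bias reduction decreases its parameter p as the evaluation budget is consumed. The values p_start = 4.0 and p_end = 1.5 below are illustrative assumptions, not the settings reported in the paper:

```python
def lehmer_mean(values, weights, p):
    """Weighted Lehmer mean: sum(w*x^p) / sum(w*x^(p-1))."""
    num = sum(w * x**p for w, x in zip(weights, values))
    den = sum(w * x**(p - 1) for w, x in zip(weights, values))
    return num / den

def linear_bias(nfe, max_nfe, p_start=4.0, p_end=1.5):
    """Linearly reduce the Lehmer mean parameter p over the run.

    nfe is the number of fitness evaluations used so far, max_nfe the
    total budget. A larger p biases the mean toward larger successful
    parameter values (favoring exploration early in the search); a
    smaller p weakens this bias, supporting exploitation at the end.
    The endpoint values here are assumptions for illustration.
    """
    return p_start + (p_end - p_start) * nfe / max_nfe
```

With p = 2 the Lehmer mean of {1, 3} is (1 + 9) / (1 + 3) = 2.5, i.e. larger than the arithmetic mean 2, which is exactly the bias toward large values that the scheme gradually removes.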

Highlights

  • The Computational Intelligence (CI) methods include a variety of approaches, such as Evolutionary Computation (EC), Fuzzy Logic (FL), and Neural Networks (NN)

  • The parameter tuning mechanism should rely on the search process quality, usually fitness values and their improvement, and for a tuning scheme to be efficient, it should be designed with respect to the algorithm properties, which leads to many different adaptation schemes developed for specific algorithms

  • The evaluation of the linear bias reduction approach was performed on two sets of benchmark problems, in particular the CEC 2017 [5] and CEC 2020 [42] competitions on bound-constrained single-objective optimization

Introduction

Computational Intelligence (CI) methods include a variety of approaches, such as Evolutionary Computation (EC), Fuzzy Logic (FL), and Neural Networks (NN). EC methods, which include Evolutionary Algorithms (EA) and Swarm Intelligence (SI), are usually developed by introducing new algorithmic schemes [1] or parameter control and adaptation techniques [2]. The results of the annual competitions on numerical optimization, conducted within the IEEE Congress on Evolutionary Computation, show [5] that in the last few years the winners have mostly been DE-based frameworks that incorporate hybridization or novel parameter tuning techniques. A parameter tuning mechanism should rely on the quality of the search process, usually fitness values and their improvement; for a tuning scheme to be efficient, it should be designed with respect to the properties of the algorithm, which has led to many different adaptation schemes developed for specific algorithms. Several recent surveys have considered the existing variants of DE and their properties [6,7].
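The fitness-improvement-driven tuning mentioned above is how SHADE-style algorithms update their historical parameter memories: successful scale-factor values are aggregated with a weighted Lehmer mean, weighted by the fitness improvements they produced. A minimal sketch of one memory-cell update (function name and simplifications are mine, not from the paper):

```python
def update_memory(m_f, successful_f, improvements, p=2.0):
    """Update one historical-memory cell for the scale factor F.

    successful_f holds the F values that produced improved trial
    vectors this generation; improvements holds the corresponding
    fitness gains. The new memory value is the improvement-weighted
    Lehmer mean of the successful values; if nothing succeeded, the
    old value m_f is kept, as in SHADE-style schemes.
    """
    if not successful_f:
        return m_f
    total = sum(improvements)
    w = [d / total for d in improvements]          # normalized weights
    num = sum(wi * f**p for wi, f in zip(w, successful_f))
    den = sum(wi * f**(p - 1) for wi, f in zip(w, successful_f))
    return num / den
```

Passing the p produced by the linearly decreasing schedule into this update is where the bias-reduction scheme plugs into the base algorithm.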
