Abstract

In this paper, we present a novel adaptive differential evolution (DE) algorithm for global optimization problems that introduces an enhanced mutation operator and a new mixed control parameter setting, both based on dual performance evaluation metrics. To further strengthen the search efficiency of the algorithm and effectively balance its exploration and exploitation, a dual performance metrics-based mutation operator is first proposed to guide the population search: it organically integrates the fitness value and history update of each individual to identify potentially promising areas and allocate suitable search resources to them. Meanwhile, a dual performance metrics-based mixed parameter setting is developed to yield appropriate control parameters for each individual by comprehensively measuring its search characteristics and requirements through both its fitness value and history update. In addition, a new restart strategy is put forward to boost the search performance of the algorithm by replacing meaningless individuals, identified by both their fitness values and history updates, with randomly generated individuals based on a Gaussian walk. In contrast to existing DE variants, the new algorithm exploits both the fitness value and history update of each individual to assign proper computational resources to each potentially promising region, create suitable parameters for different individuals, and eliminate valueless individuals from the population. It is thereby able to heighten the search efficiency of the algorithm and maintain an effective balance between exploration and exploitation. Finally, the performance of the proposed algorithm is evaluated against 19 typical or state-of-the-art algorithms on 42 benchmark functions from the IEEE CEC2017 and CEC2022 test suites.
Compared to these opponents, the proposed algorithm achieves significantly better performance in 60 out of 77 cases according to the multiproblem Wilcoxon signed-rank test at a significance level of 0.05, and is thus a more promising optimizer.
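To make the abstract's ideas concrete, the sketch below illustrates (not the authors' actual method) how a dual performance metric might be used in a DE loop: each individual carries both a fitness value and a stagnation counter (a proxy for its "history update"), the two ranks are combined to pick a promising base vector for mutation, and individuals that are both poor and long-stagnant are restarted by a Gaussian walk around the current best. All names, parameter values, and the specific combination rule are assumptions for illustration only.

```python
import numpy as np

def sphere(x):
    """Toy objective for the demo (minimum 0 at the origin)."""
    return float(np.sum(x ** 2))

def dual_metric_de(fn, dim=5, pop_size=20, max_gens=200, f=0.5, cr=0.9,
                   stall_limit=15, seed=0):
    """Illustrative DE sketch: dual-metric guided mutation + Gaussian-walk restart.
    This is a hypothetical simplification of the ideas in the abstract, not the
    published algorithm."""
    rng = np.random.default_rng(seed)
    lo, hi = -5.0, 5.0
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fit = np.array([fn(x) for x in pop])
    stall = np.zeros(pop_size, dtype=int)  # generations since last improvement

    for _ in range(max_gens):
        # Dual metric: rank by fitness and by stagnation (lower is better for both),
        # then sum the ranks to score each individual.
        fit_rank = np.argsort(np.argsort(fit))
        stall_rank = np.argsort(np.argsort(stall))
        promising = pop[np.argmin(fit_rank + stall_rank)]
        best = pop[np.argmin(fit)]

        for i in range(pop_size):
            r1, r2 = rng.choice([j for j in range(pop_size) if j != i],
                                size=2, replace=False)
            # "current-to-promising" mutation biased toward the dual-metric winner.
            mutant = pop[i] + f * (promising - pop[i]) + f * (pop[r1] - pop[r2])
            cross = rng.random(dim) < cr
            cross[rng.integers(dim)] = True  # guarantee at least one mutant gene
            trial = np.clip(np.where(cross, mutant, pop[i]), lo, hi)
            f_trial = fn(trial)
            if f_trial < fit[i]:
                pop[i], fit[i], stall[i] = trial, f_trial, 0
            else:
                stall[i] += 1
                # Restart a "meaningless" individual: worse than the median fitness
                # AND stagnant too long -> Gaussian walk around the current best.
                if stall[i] > stall_limit and fit[i] > np.median(fit):
                    pop[i] = np.clip(best + rng.normal(0.0, 1.0, dim), lo, hi)
                    fit[i] = fn(pop[i])
                    stall[i] = 0

    i_best = int(np.argmin(fit))
    return pop[i_best], float(fit[i_best])
```

A typical call, `dual_metric_de(sphere)`, drives the sphere function close to its optimum; the restart rule only fires for individuals that fail on both metrics simultaneously, which mirrors the abstract's criterion of measuring "meaningless" individuals by fitness value and history update together.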
