Abstract
Optimization plays a key role in practical engineering problems across disciplines. This paper presents Differential Gradient Evolution Plus (DGE+), a novel hybrid meta-heuristic optimization algorithm based on Differential Evolution (DE), Gradient Evolution (GE), and a jumping technique. The proposed algorithm hybridizes DE and GE through an improvised dynamic probability distribution, and additionally provides a new shake-off method to avoid premature convergence towards local minima. To evaluate its efficiency, robustness, and reliability, DGE+ has been applied to seven constrained benchmark problems; the comparison results reveal that the proposed algorithm delivers compact, competitive, and promising performance.
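The abstract mentions that DE and GE are hybridized through a dynamic probability distribution. The paper's exact rule is not given here, so the following is only a minimal sketch of one plausible scheme, in which each operator's selection probability adapts to its recent success rate; the function `choose_operator`, the success counters, and the 70%/30% success rates are all illustrative assumptions, not the DGE+ definition.

```python
import random

random.seed(1)

def choose_operator(success_de, success_ge):
    # Dynamic probability (illustrative assumption, not the paper's rule):
    # weight each operator by its accumulated success count.
    total = success_de + success_ge
    p_de = success_de / total if total else 0.5
    return "DE" if random.random() < p_de else "GE"

# Toy bookkeeping: both operators start equal; here DE is assumed to
# succeed more often, so its selection probability grows over time.
success = {"DE": 1, "GE": 1}
picks = []
for _ in range(1000):
    op = choose_operator(success["DE"], success["GE"])
    picks.append(op)
    if op == "DE" and random.random() < 0.7:    # assumed DE success rate
        success["DE"] += 1
    elif op == "GE" and random.random() < 0.3:  # assumed GE success rate
        success["GE"] += 1

de_share = picks.count("DE") / len(picks)
```

Under this kind of feedback, the operator that improves solutions more often is drawn more frequently, while the other is never starved entirely because its count stays positive.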
Highlights
Optimization plays a key role in practical engineering problems across disciplines
A novel hybrid meta-heuristic optimization algorithm, Differential Gradient Evolution Plus (DGE+), based on Differential Evolution (DE), Gradient Evolution (GE), and a jumping technique, is presented
To evaluate its robustness and reliability, DGE+ is applied to seven constrained benchmark problems; the comparison results show compact, competitive, and promising performance
Summary
Gradient Evolution (GE) is an optimization algorithm based on the concept of gradients. Its vector updating operator is derived from the Taylor series expansion and transforms the gradient-based updating rule into a population-based search. The vector jumping operator helps vectors escape local optima, and the vector refreshing operator is applied when a vector fails to move to a different location over multiple iterations.
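The operators described above can be sketched as follows. This is a simplified illustration, not the paper's implementation: the `sphere` objective, the step size, the jump rate, and the use of the best and worst population vectors as a finite-difference-style direction are all assumptions, and the vector refreshing operator is omitted for brevity.

```python
import random

def sphere(x):
    # Toy objective (assumption, not one of the paper's seven benchmarks).
    return sum(v * v for v in x)

def ge_update(x, x_best, x_worst, step=0.5):
    # Vector updating (sketch): step along a finite-difference-style
    # direction built from the current best and worst vectors, echoing
    # the Taylor-series-derived rule described in the summary.
    return [xi + step * (b - w) / 2.0 for xi, b, w in zip(x, x_best, x_worst)]

def ge_jump(x, rate=0.3):
    # Vector jumping (sketch): random perturbation to escape local optima.
    return [xi + random.uniform(-rate, rate) for xi in x]

random.seed(0)
pop = [[random.uniform(-5, 5) for _ in range(2)] for _ in range(10)]
initial_fit = min(sphere(v) for v in pop)

for _ in range(200):
    pop.sort(key=sphere)
    best, worst = pop[0], pop[-1]
    for i in range(1, len(pop)):
        cand = ge_update(pop[i], best, worst)
        if random.random() < 0.1:           # occasional jump
            cand = ge_jump(cand)
        if sphere(cand) < sphere(pop[i]):   # greedy selection: keep the better vector
            pop[i] = cand

best_fit = min(sphere(v) for v in pop)
```

Because candidates replace a vector only when they improve it, the best fitness in the population never worsens; the jumps inject diversity so the search is less likely to stall at a local optimum.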