Abstract

Optimization plays a key role in practical engineering problems across many disciplines. This paper presents Differential Gradient Evolution Plus (DGE+), a novel hybrid meta-heuristic optimization algorithm based on Differential Evolution (DE), Gradient Evolution (GE), and a jumping technique. The proposed algorithm hybridizes the above-mentioned algorithms through an improvised dynamic probability distribution and additionally provides a new shake-off method to avoid premature convergence toward local minima. To evaluate the efficiency, robustness, and reliability of DGE+, it was applied to seven benchmark constrained problems; the comparison results show that the proposed algorithm delivers compact, competitive, and promising performance.

Highlights

  • Optimization is important and widely applicable across disciplines

  • A novel hybrid meta-heuristic optimization algorithm, Differential Gradient Evolution Plus (DGE+), based on Differential Evolution (DE), Gradient Evolution (GE), and a jumping technique, is presented in this paper

  • To evaluate its robustness and reliability, DGE+ was applied to seven benchmark constrained problems; the comparison results show that the proposed algorithm delivers compact, competitive, and promising performance


Summary

Gradient evolution algorithm

Gradient evolution (GE) is an optimization algorithm based on the concept of gradients. Its vector updating operator is derived from the Taylor series expansion and transforms the resulting updating law into a population-based search. The vector jumping operator helps the search escape local optima, and the vector refreshing operator reinitializes a vector that has been unable to move to a new location for several iterations.
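The three GE operators described above can be illustrated with a minimal sketch. This is not the published GE updating law; it assumes a simple Newton-like step using a central-difference gradient estimate, a uniform random jump, and a refresh after a fixed stagnation count (the step size `gamma`, jump range, and stagnation limit are illustrative choices):

```python
import random

def sphere(x):
    # simple test objective: f(x) = sum(x_i^2), minimum 0 at the origin
    return sum(v * v for v in x)

def ge_sketch(f, dim=2, pop_size=10, iters=50, gamma=0.5, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    stall = [0] * pop_size          # iterations each vector has failed to move
    best = min(pop, key=f)
    h = 1e-3                        # finite-difference step
    for _ in range(iters):
        for i, x in enumerate(pop):
            # gradient estimate by central differences (stand-in for the
            # Taylor-series-based updating operator)
            grad = []
            for d in range(dim):
                xp, xm = list(x), list(x)
                xp[d] += h
                xm[d] -= h
                grad.append((f(xp) - f(xm)) / (2 * h))
            # vector updating: Newton-like descent step
            cand = [x[d] - gamma * grad[d] for d in range(dim)]
            if f(cand) < f(x):
                pop[i], stall[i] = cand, 0
                continue
            # vector jumping: random perturbation to escape a local optimum
            jump = [x[d] + rng.uniform(-1, 1) for d in range(dim)]
            if f(jump) < f(x):
                pop[i], stall[i] = jump, 0
            else:
                stall[i] += 1
            # vector refreshing: reinitialize a vector stuck too long
            if stall[i] > 5:
                pop[i] = [rng.uniform(-5, 5) for _ in range(dim)]
                stall[i] = 0
        best = min(pop + [best], key=f)
    return best
```

On the sphere function the descent step dominates and the sketch converges quickly; the jumping and refreshing branches matter mainly on multimodal objectives.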

Differential gradient evolution plus
Parameter selection
  • Population size (Ps)
  • Number of generations (GN)
  • Gradient evolution parameter (γ)
  • Differential evolution scale factor (SF)
  • Differential evolution crossover rate (CR)
  • Selection probability (SP)
  • Sub-optimal solution acceptance rate (AR)
  • Refresh rate (RR)
  • Shake-off threshold (ST)
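The parameters listed above can be collected into a single configuration object. The sketch below uses the paper's parameter names; the default values are illustrative placeholders, not the tuned values from the paper:

```python
from dataclasses import dataclass

@dataclass
class DGEPlusConfig:
    """Parameter set for a DGE+-style run; defaults are illustrative only."""
    pop_size: int = 50            # Ps: population size
    generations: int = 1000       # GN: number of generations
    gamma: float = 0.5            # γ: gradient evolution parameter
    scale_factor: float = 0.8     # SF: DE scale factor
    crossover_rate: float = 0.9   # CR: DE crossover rate
    selection_prob: float = 0.5   # SP: probability of applying DE vs. GE update
    acceptance_rate: float = 0.1  # AR: sub-optimal solution acceptance rate
    refresh_rate: float = 0.1     # RR: vector refresh rate
    shake_off_threshold: int = 10 # ST: stagnation count triggering shake-off
```

Grouping the parameters this way makes an experiment reproducible from a single object, e.g. `DGEPlusConfig(pop_size=100, crossover_rate=0.7)`.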
Constraint handling
Rule 4
Experiments on constrained optimization problems
Constrained problem 2
Constrained problem 3
Constrained problem 4
Methods
Constrained problem 6
Constrained problem 7
Findings
Conclusions