Abstract

Many studies have shown that differential evolution (DE) is one of the most powerful stochastic real-parameter algorithms for global optimization. However, a stagnation problem still exists in DE variants. To overcome this drawback, two lines of improvement have recently emerged. One is to combine multiple mutation operators to balance exploration and exploitation. The other is to develop theoretically convergent DE variants that decrease the probability of stagnation. Motivated by both, this paper proposes a subspace clustering mutation operator, called SC_qrtop. Five DE variants that are globally convergent in probability are then developed by combining the proposed operator with five classical DE mutation operators, respectively. SC_qrtop randomly selects an elite individual as the perturbation center and uses the difference between two randomly generated boundary individuals as the perturbation step. Theoretical analyses and numerical simulations demonstrate that SC_qrtop prefers to search in the orthogonal subspace centered on the elite individual. Experimental results on the CEC2005 benchmark functions indicate that all five convergent DE variants with the SC_qrtop mutation outperform the corresponding DE algorithms.
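To make the operator's mechanics concrete, the following minimal Python sketch illustrates the SC_qrtop idea described above: the perturbation is centered on a randomly chosen elite individual, and its step is the scaled difference of two individuals generated at random on the search-space bounds. The elite-pool fraction `q`, the scaling factor `F`, and the way boundary individuals are sampled are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def sc_qrtop(population, fitness, lower, upper, q=0.1, F=0.5, rng=None):
    """Illustrative sketch of an SC_qrtop-style mutation (minimization).

    A mutant is built by perturbing a randomly selected elite individual with
    the scaled difference of two randomly generated boundary individuals.
    The elite-pool size q, scaling factor F, and boundary sampling are
    assumptions made for illustration only.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, dim = population.shape

    # Elite pool: the top q fraction of the population by fitness.
    elite_size = max(1, int(np.ceil(q * n)))
    elite_idx = np.argsort(fitness)[:elite_size]
    center = population[rng.choice(elite_idx)]  # perturbation center

    # Two boundary individuals: each coordinate set to the lower or upper
    # bound at random (one possible reading of "randomly generated
    # boundary individuals").
    b1 = np.where(rng.random(dim) < 0.5, lower, upper)
    b2 = np.where(rng.random(dim) < 0.5, lower, upper)

    # Perturbation step: scaled difference of the two boundary individuals.
    mutant = center + F * (b1 - b2)
    return np.clip(mutant, lower, upper)
```

Under this reading, `b1 - b2` is nonzero only in the coordinates where the two boundary vectors disagree, so the perturbation moves along a random axis-aligned subspace around the elite individual, which is consistent with the orthogonal-subspace search behavior claimed in the abstract.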

Highlights

  • The classical optimization methods frequently used in scientific applications consist of strategies based on the Hessian matrix [1] and on the gradient [2]

  • If the derivative of an objective function cannot be calculated, it becomes difficult for classical optimization methods to find the optimal solution [4]

  • The modified algorithm employs the SC_qrtop mutation operator with probability q%

Summary

Introduction

The classical optimization methods frequently used in scientific applications consist of strategies based on the Hessian matrix [1] and on the gradient [2]. To address the stagnation problem of DE, two directions of improvement have emerged. One is to develop DE variants based on composite trial vector generation strategies; for example, Rahnamayan et al. [15] proposed an opposition-based DE, which combines an opposition-based learning method with the classical mutation operators to generate trial vectors. The other builds on theoretical advances: with progress in the theoretical study of DE, several convergent DE algorithms grounded in mathematical theory have been proposed. In [17], Hu et al. proved that classical DE cannot converge to the global optimal set with probability 1 and proposed a convergent DE algorithm. As its second contribution, this paper presents a convergent DE model that combines the SC_qrtop mutation with the classical mutation operators of DE and gives a theoretical proof of the algorithm's convergence.
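As a rough illustration of how such a convergent variant might interleave the two operators, the sketch below applies the SC_qrtop-style mutation with probability q% and a classical DE/rand/1 mutation otherwise. It reuses the hypothetical `sc_qrtop` helper sketched earlier; the probability threshold, scaling factor, and choice of DE/rand/1 as the classical operator are illustrative assumptions rather than the paper's exact algorithm.

```python
import numpy as np

def mutate_convergent_de(population, fitness, lower, upper,
                         q_percent=10.0, F=0.5, rng=None):
    """Sketch: pick between an SC_qrtop-style mutation and DE/rand/1.

    With probability q% the SC_qrtop-style operator (sketched earlier) is
    applied; otherwise a classical DE/rand/1 mutant is produced. All
    parameter values here are illustrative assumptions.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, _ = population.shape

    if rng.random() < q_percent / 100.0:
        # Subspace clustering mutation around a randomly chosen elite
        # individual (relies on the sc_qrtop sketch defined above).
        return sc_qrtop(population, fitness, lower, upper, F=F, rng=rng)

    # Classical DE/rand/1: x_r1 + F * (x_r2 - x_r3) with distinct indices.
    r1, r2, r3 = rng.choice(n, size=3, replace=False)
    mutant = population[r1] + F * (population[r2] - population[r3])
    return np.clip(mutant, lower, upper)
```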

Classical Differential Evolution
Subspace Clustering Mutation Operator
Convergent DE Algorithm Based on Subspace Clustering Mutation
Numerical Experiments
Discussion
Conclusion and Future Work