Abstract

Many population-based metaheuristic optimization algorithms have been proposed in recent decades; according to the No Free Lunch (NFL) theorem, none of them can outperform all other algorithms or solve all optimization problems. Many of these algorithms perform effectively on different engineering problems, provided their control parameter(s) are set correctly. Their optimization behavior can be boosted by various strategies, including hybridization and the use of chaotic maps in place of pseudo-random number generators (PRNGs). Hybrid algorithms are suitable for a large number of engineering applications, in which they behave more effectively than the original, stand-alone optimization algorithms. However, they make it harder to set the control parameters correctly, and some are designed to solve particular problems only. This paper presents three hybridizations, dubbed HYBPOP, HYBSUBPOP, and HYBIND, of up to seven algorithms that are free of control parameters. Each hybrid uses a different strategy to switch the algorithm in charge of generating each new individual. These algorithms are Jaya, the sine cosine algorithm (SCA), Rao’s algorithms, teaching-learning-based optimization (TLBO), and chaotic Jaya. The experimental results show that the proposed hybrids perform better than the original algorithms, which indicates that the constituent algorithms are exploited according to the problem being solved. A further advantage of the hybrid algorithms is that no prior control parameter tuning is needed.
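
The switching idea can be illustrated with a minimal sketch (not the paper's exact method): a single population in which each new candidate is produced by one of several control-parameter-free rules, here Jaya and Rao-1 picked by a hypothetical round-robin schedule, with greedy replacement when the cost improves. The sphere test function, the schedule, and the default sizes are illustrative assumptions; the actual switching strategies of HYBPOP, HYBSUBPOP, and HYBIND are defined in the paper.

import numpy as np

def sphere(x):
    # Example cost function (minimization), used only for illustration.
    return np.sum(x ** 2)

def jaya_move(x, best, worst, rng):
    # Jaya rule: move towards the best solution and away from the worst one.
    r1, r2 = rng.random(x.size), rng.random(x.size)
    return x + r1 * (best - np.abs(x)) - r2 * (worst - np.abs(x))

def rao1_move(x, best, worst, rng):
    # Rao-1 rule: step along the direction from the worst to the best solution.
    return x + rng.random(x.size) * (best - worst)

def hybrid_optimize(cost, dim, pop_size=30, iters=200, bounds=(-100.0, 100.0), seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fit = np.array([cost(x) for x in pop])
    rules = (jaya_move, rao1_move)              # pool of parameter-free update rules
    for t in range(iters):
        best = pop[np.argmin(fit)].copy()
        worst = pop[np.argmax(fit)].copy()
        for k in range(pop_size):
            move = rules[(t + k) % len(rules)]  # hypothetical round-robin switch
            cand = np.clip(move(pop[k], best, worst, rng), lo, hi)
            f = cost(cand)
            if f < fit[k]:                      # greedy replacement on cost improvement
                pop[k], fit[k] = cand, f
    return pop[np.argmin(fit)], fit.min()

best_x, best_f = hybrid_optimize(sphere, dim=10)
print(best_f)

Extending the rules tuple with further parameter-free moves (for instance an SCA or TLBO operator) is how such a skeleton would grow.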

Highlights

  • It is well known that metaheuristic optimization methods are widely used to solve problems in several fields of science and engineering

  • Three hybrid algorithms free of control parameters, dubbed HYBSUBPOP, HYBPOP, and HYBIND, are designed. They are built on a dynamic skeleton that allows any metaheuristic optimization algorithm to be included, enabling further improvements

  • The only requirement for merging a new optimization algorithm into the proposed skeleton is to know whether that algorithm replaces an individual only when the cost function improves; a minimal interface sketch is given after this list
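
One way to read that requirement is as a small plug-in descriptor; the names Rule, move, and greedy below are assumptions for illustration, not the paper's API. Each algorithm registers its move operator together with a flag saying whether it already accepts only improving candidates, so the skeleton knows whether it must add the greedy cost check itself.

from dataclasses import dataclass
from typing import Callable
import numpy as np

# Signature assumed here: a move takes (individual, best, worst, rng) and returns a candidate.
Move = Callable[[np.ndarray, np.ndarray, np.ndarray, np.random.Generator], np.ndarray]

@dataclass
class Rule:
    # Hypothetical plug-in descriptor for the hybrid skeleton.
    name: str
    move: Move    # produces a new candidate individual
    greedy: bool  # True if the algorithm itself only accepts improving candidates
                  # (e.g. Jaya, TLBO); False if it always accepts the new position
                  # (e.g. SCA), in which case the skeleton applies the greedy check.

# Example registration (jaya_move as in the earlier sketch; an SCA move would not be greedy):
# rules = [Rule("Jaya", jaya_move, greedy=True), Rule("SCA", sca_move, greedy=False)]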


Summary

Introduction

It is well known that metaheuristic optimization methods are widely used to solve problems in many fields of science and engineering. Each optimization method proposes its own rules for the evolution of the population towards the optimum. These algorithms are suitable for general problems, but each one has different skills in global exploration and local exploitation. Chaos theory studies nonlinear dynamic systems that are characterized by a high sensitivity to their initial conditions [19,20]. Chaotic maps derived from such systems can replace the PRNGs when producing the control parameters or performing local searches [21,22,23,24,25,26,27,28,29,30,31,32,33,34].
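
As an illustration of replacing a PRNG with a chaotic map, the sketch below uses the logistic map, a widely used choice in such approaches; the particular maps and initial values employed in the cited studies vary, so x0 = 0.7 and mu = 4 here are illustrative assumptions.

def logistic_map(x0=0.7, mu=4.0):
    # Chaotic generator: x_{t+1} = mu * x_t * (1 - x_t), fully chaotic for mu = 4,
    # yielding values in (0, 1) that can stand in for pseudo-random draws.
    x = x0
    while True:
        x = mu * x * (1.0 - x)
        yield x

chaos = logistic_map()
# Drop-in replacement for rng.random() when drawing single random factors,
# e.g. the r1 and r2 factors of a Jaya-style update (a common "chaotic Jaya" variant):
r1 = next(chaos)
r2 = next(chaos)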

Preliminaries
Hybrid Algorithms
Numerical Experiments
Findings
Conclusions
