Abstract

Metaheuristic algorithms are widely used for optimization in both research and industry for their simplicity, flexibility, and robustness. However, multi-modal optimization is a difficult task, even for metaheuristic algorithms. Two important issues that must be handled when solving multi-modal problems are (a) locating multiple local/global optima and (b) maintaining these optima until the end of the search. In addition, a strong local search ability is a prerequisite for reaching the exact global optima. The Grey Wolf Optimizer (GWO) is a recently developed nature-inspired metaheuristic algorithm that requires little parameter tuning. However, GWO suffers from premature convergence and fails to maintain the balance between exploration and exploitation on multi-modal problems. This study proposes a niching GWO (NGWO) that incorporates the personal-best feature of PSO and a local search technique to address these issues. The proposed algorithm has been tested on 23 benchmark functions and three engineering cases. On most test functions, NGWO outperformed state-of-the-art metaheuristics such as PSO, GSA, GWO, and Jaya, as well as two improved GWO variants and a niching CSA. Statistical analysis and Friedman tests have been conducted to compare the performance of these algorithms thoroughly.
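For context, the baseline search mechanism that NGWO builds on is the canonical GWO update (Mirjalili et al.), in which the three best wolves (alpha, beta, delta) guide the rest of the pack while a control parameter decays to shift the search from exploration to exploitation. The sketch below shows only this standard GWO step, not the proposed niching or local-search extensions; the function names and parameter values are illustrative:

```python
import numpy as np

def gwo_minimize(f, dim, bounds, n_wolves=20, n_iter=200, seed=0):
    """Minimal sketch of the canonical Grey Wolf Optimizer."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, size=(n_wolves, dim))   # initial pack positions
    fitness = np.apply_along_axis(f, 1, X)

    for t in range(n_iter):
        order = np.argsort(fitness)
        # Snapshot the three leaders for this generation.
        alpha = X[order[0]].copy()
        beta = X[order[1]].copy()
        delta = X[order[2]].copy()
        a = 2 - 2 * t / n_iter                      # decays linearly from 2 to 0

        for i in range(n_wolves):
            new_pos = np.zeros(dim)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A = 2 * a * r1 - a                  # encircling coefficient
                C = 2 * r2
                D = np.abs(C * leader - X[i])       # distance to this leader
                new_pos += leader - A * D
            X[i] = np.clip(new_pos / 3, lo, hi)     # average of the three guided moves
            fitness[i] = f(X[i])

    best = int(np.argmin(fitness))
    return X[best], float(fitness[best])

# Example: minimize the sphere function in 5 dimensions.
sphere = lambda x: float(np.sum(x**2))
x_best, f_best = gwo_minimize(sphere, dim=5, bounds=(-10.0, 10.0))
```

Because all wolves are pulled toward the same three leaders, the pack tends to collapse onto a single basin; the niching and personal-best mechanisms described in the paper are meant to counter exactly this behavior on multi-modal landscapes.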

Highlights

  • The area of nature-inspired metaheuristic algorithms is continuously evolving with newly developed algorithms

  • We evaluated the proposed algorithm’s performance comprehensively on 23 test functions, treated as minimization problems [13,18]

  • The Grey Wolf Optimizer (GWO) algorithm was influenced by the leadership structure and group hunting system of wolves


Introduction

The area of nature-inspired metaheuristic algorithms is continuously evolving with newly developed algorithms. These algorithms are well known for their broad local and global search ability, local optima avoidance, and quick convergence. They do not need any knowledge of a function’s gradient or differentiability [1,2]. Since metaheuristic algorithms combine these desirable properties with easy applicability and few parameter requirements, many variants of them have been developed [3]. The most widely used algorithms are Simulated Annealing (SA) [4], Genetic Algorithm (GA) [5], Particle Swarm Optimization (PSO) [6], and Differential Evolution (DE). A few recent metaheuristics that have attracted researchers’ attention include the Bee Collecting Pollen Algorithm (BCPA) [9] and the Black …

