Abstract

As a nature-inspired search algorithm with few control parameters, the firefly algorithm can deliver strong search performance. In this paper, we present a new firefly algorithm that addresses parameter selection and the adaptation strategy of the standard firefly algorithm. The proposed algorithm introduces a modified exploration and exploitation mechanism with adaptive randomness and absorption coefficients, making both coefficients functions of time (iterations). Moreover, gray relational analysis is used so that advancing fireflies can effectively extract different information from more attractive ones. Standard benchmark functions are applied to verify the effect of these improvements, and the results illustrate that, in most situations, the proposed firefly algorithm is superior to, or at least highly competitive with, the standard firefly algorithm and state-of-the-art approaches.
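The adaptive-coefficient idea described above can be sketched in a minimal firefly loop. The decay schedules for the randomness coefficient (alpha) and absorption coefficient (gamma) below are illustrative assumptions, not the paper's exact adaptation rules, and the gray-relational-analysis step is omitted:

```python
import numpy as np

def firefly_minimize(f, dim=2, n=25, iters=200, seed=0):
    """Minimal firefly algorithm sketch with time-varying
    randomness (alpha) and absorption (gamma) coefficients.
    The schedules are illustrative, not the paper's exact rules."""
    rng = np.random.default_rng(seed)
    lo, hi = -5.0, 5.0
    X = rng.uniform(lo, hi, (n, dim))      # firefly positions
    beta0 = 1.0                            # attractiveness at zero distance
    for t in range(iters):
        alpha = 0.5 * (0.97 ** t)          # randomness shrinks over iterations
        gamma = 0.1 + 0.9 * t / iters      # absorption grows over iterations
        F = np.apply_along_axis(f, 1, X)   # brightness = objective value
        for i in range(n):
            for j in range(n):
                if F[j] < F[i]:            # move firefly i toward brighter j
                    r2 = np.sum((X[i] - X[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    X[i] += beta * (X[j] - X[i]) \
                            + alpha * (rng.random(dim) - 0.5)
        X = np.clip(X, lo, hi)
    F = np.apply_along_axis(f, 1, X)
    return X[np.argmin(F)], F.min()

# usage: minimize the sphere function in 2-D
best_x, best_f = firefly_minimize(lambda x: np.sum(x ** 2))
```

Decaying alpha reduces random perturbations as the search converges (exploitation), while the growing gamma localizes attraction, which is one simple way to realize the exploration/exploitation balance the paper targets.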

Highlights

  • Optimization is involved in all our activities

  • The proposed method is shown to be effective in achieving satisfactory results, since it helps balance exploitation and exploration abilities

  • We compare the performance of the proposed firefly algorithm with other optimization approaches, such as Standard Particle Swarm Optimization (SPSO) [24], Differential Evolution (DE) [9], [25], Variable Step Size Firefly Algorithm (VSSFA) [26], Memetic Firefly


Introduction

Optimization is involved in all our activities. Everything from the simple decision of what time to leave for work to more complex decisions, such as how to budget a daily cost-of-living allowance, requires optimization. Optimization is the process of finding an optimal solution for a function: all feasible values are candidate solutions, and the optimal solution is the extreme value among them. Optimization algorithms fall into two categories. Deterministic methods include classical techniques such as the golden-section method, Newton's method, modified Newton methods, and gradient methods, along with
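As a concrete instance of the deterministic methods mentioned above, Newton's method for one-dimensional minimization can be sketched as follows; the function, its derivatives, and the stopping tolerance are illustrative assumptions:

```python
def newton_minimize(df, d2f, x0, tol=1e-10, max_iter=100):
    """Newton's method for 1-D minimization: iterate
    x <- x - f'(x) / f''(x) until the step is negligible.
    Assumes f is smooth with f'' > 0 near the minimum."""
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# usage: minimize f(x) = (x - 3)**2 + 1, so f'(x) = 2(x - 3), f''(x) = 2
x_star = newton_minimize(lambda x: 2 * (x - 3), lambda x: 2.0, x0=0.0)
# for a quadratic, a single Newton step lands exactly on the minimizer x = 3
```

Unlike the stochastic firefly search, such deterministic methods require derivative information and converge to a single local optimum, which motivates nature-inspired alternatives on multimodal problems.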
