Abstract

Stochastic optimization algorithms such as the genetic algorithm (GA), particle swarm optimization (PSO) and the bat algorithm (BA) carry out global optimization but typically consume substantial computational effort. On the other hand, deterministic algorithms like gradient descent converge rapidly but may get stuck in local minima of multimodal functions. Thus, an approach that couples the strengths of stochastic and deterministic optimization schemes is needed to improve the accuracy of the final solution without getting trapped in local minima. The bat algorithm is a recently developed swarm optimization technique that has proved powerful for multimodal optimization problems. However, the standard BA shows premature convergence and reduced convergence speed under some conditions. Therefore, a recently proposed enhanced bat algorithm (EBA) is augmented with gradient search to create a gradient enhanced bat algorithm (GEBA). This paper presents GEBA, which post-hybridizes EBA with a gradient-based local search algorithm to ensure accurate local exploration in the final stages of the optimization. GEBA is tested on several benchmark functions and is found to perform very well. An automated gradient enhanced bat algorithm (AGEBA) is also developed to address the problem of selecting a good initial guess for the gradient-based algorithm. AGEBA is found to be efficient, requiring only one tuning error parameter, thereby saving considerable manual effort.
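To illustrate the post-hybridization idea described above, the following is a minimal Python sketch: a simplified standard bat algorithm stands in for EBA (whose specific enhancements are not detailed here), and its best solution seeds a gradient-descent refinement. The test function (`sphere`), the central-difference gradient, and all parameter values (loudness, pulse rate, learning rate, etc.) are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np

def sphere(x):
    """Illustrative test function (global minimum 0 at the origin)."""
    return np.sum(x ** 2)

def numerical_grad(f, x, h=1e-6):
    """Central-difference gradient; an assumption, not the paper's scheme."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def bat_algorithm(f, dim=2, n_bats=20, iters=200, lb=-5.0, ub=5.0, seed=0):
    """Simplified standard BA; stands in for the EBA global-search phase."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lb, ub, (n_bats, dim))  # bat positions
    v = np.zeros((n_bats, dim))             # bat velocities
    fit = np.apply_along_axis(f, 1, x)
    best = x[fit.argmin()].copy()
    A, r = 0.9, 0.5                         # loudness and pulse rate (assumed)
    for _ in range(iters):
        freq = rng.uniform(0.0, 2.0, (n_bats, 1))  # pulse frequencies
        v += (x - best) * freq
        cand = np.clip(x + v, lb, ub)
        # local random walk around the current best, gated by pulse rate
        walk = rng.random(n_bats) > r
        cand[walk] = np.clip(
            best + 0.01 * A * rng.standard_normal((walk.sum(), dim)), lb, ub)
        cand_fit = np.apply_along_axis(f, 1, cand)
        accept = (cand_fit < fit) & (rng.random(n_bats) < A)
        x[accept], fit[accept] = cand[accept], cand_fit[accept]
        best = x[fit.argmin()].copy()
    return best

def gradient_refine(f, x0, lr=0.1, steps=200, tol=1e-10):
    """Gradient-descent local search applied after the swarm phase
    (the 'post-hybridization' step)."""
    x = x0.copy()
    for _ in range(steps):
        g = numerical_grad(f, x)
        if np.linalg.norm(g) < tol:
            break
        x -= lr * g
    return x

x_global = bat_algorithm(sphere)              # stochastic global exploration
x_final = gradient_refine(sphere, x_global)   # deterministic local exploitation
print(sphere(x_global), sphere(x_final))
```

The swarm phase avoids local-minimum traps by exploring globally, and the gradient phase then delivers the high final accuracy that the stochastic search alone converges to slowly.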
