Abstract

Vegetation evolution (VEGE) is a recently proposed meta-heuristic algorithm (MA) with excellent exploitation but relatively weak exploration capacity. We therefore focus on better balancing the exploitation and exploration of VEGE to improve its overall optimization performance. This paper proposes an improved Q-learning-based VEGE (QVEGE), in which an exploitation archive and an exploration archive provide a variety of search strategies; each archive contains four efficient and easily implemented search strategies. In addition, online Q-learning with an ε-greedy scheme acts as the decision-maker, learning from the past optimization process and automatically determining the search strategy for each individual. In numerical experiments, we compare the proposed QVEGE with eight state-of-the-art MAs, including the original VEGE, on the CEC2020 benchmark functions, twelve engineering optimization problems, and wireless sensor network (WSN) coverage optimization problems. Experimental and statistical results confirm that QVEGE achieves significant improvements and is a strong competitor to existing algorithms. The source code of QVEGE is publicly available at https://github.com/RuiZhong961230/QVEGE.
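To illustrate the kind of ε-greedy Q-learning strategy selection the abstract describes, the following minimal Python sketch chooses among eight hypothetical strategy indices (0-3 standing in for an exploitation archive, 4-7 for an exploration archive). The state definition, reward, class name, and hyperparameters are assumptions for illustration only and are not taken from the authors' implementation.

```python
import numpy as np

# Minimal sketch (assumed details, not the authors' code): an epsilon-greedy
# Q-learning selector over 8 strategy indices. Here the "state" is simply the
# strategy chosen in the previous step; rewards and hyperparameters are illustrative.
class QStrategySelector:
    def __init__(self, n_strategies=8, alpha=0.1, gamma=0.9, epsilon=0.1, seed=0):
        self.rng = np.random.default_rng(seed)
        self.q = np.zeros((n_strategies, n_strategies))  # Q[state, action]
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon

    def select(self, state):
        # epsilon-greedy: random strategy with probability epsilon,
        # otherwise the strategy with the highest Q-value for this state
        if self.rng.random() < self.epsilon:
            return int(self.rng.integers(self.q.shape[1]))
        return int(np.argmax(self.q[state]))

    def update(self, state, action, reward, next_state):
        # standard one-step Q-learning update
        td_target = reward + self.gamma * np.max(self.q[next_state])
        self.q[state, action] += self.alpha * (td_target - self.q[state, action])


# Toy usage: reward a strategy whenever it improves an individual's fitness.
selector = QStrategySelector()
state, best_fitness = 0, float("inf")
for step in range(100):
    action = selector.select(state)
    # ... apply the chosen search strategy to the individual here ...
    new_fitness = best_fitness - selector.rng.random()  # placeholder improvement
    reward = 1.0 if new_fitness < best_fitness else -1.0
    best_fitness = min(best_fitness, new_fitness)
    selector.update(state, action, reward, next_state=action)
    state = action
```

In this sketch the Q-table is updated online during the run, so strategies that repeatedly improve fitness accumulate higher Q-values and are selected more often, mirroring the decision-maker role described in the abstract.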
