Abstract

Computationally efficient algorithms for large-scale black-box optimization have become increasingly important in recent years due to the growing complexity of engineering and scientific problems. In this paper, a novel algorithm called the Self-adaptive Fast Fireworks Algorithm (SF-FWA) is proposed for effective large-scale black-box optimization. The main idea is to employ a set of expressive yet computationally efficient search distributions to cope with different function landscapes, while tuning the hyperparameters of those distributions online. To this end, the Expressive Fast Explosion (EFE) mechanism is designed for effective and efficient sampling, and the Inter-Fireworks Competitive Cooperation (IFCC) mechanism is designed to adapt the hyperparameter distributions. This optimization paradigm equips the population with the ability to automatically adjust to a rich set of function landscapes at linear computational cost in the problem dimensionality. Experimental studies show that SF-FWA not only exploits the separability of a problem efficiently but also handles rotational transformations of the coordinate system. Numerical results on the standard large-scale optimization benchmark suite indicate that SF-FWA outperforms current state-of-the-art large-scale optimization algorithms. Its strong performance in optimizing neural network controllers for reinforcement learning tasks further demonstrates its potential for a wider range of real-world problems.
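The abstract does not specify the EFE sampling mechanism itself, but the key scalability claim, drawing candidate solutions from a search distribution at cost linear in the problem dimensionality, can be illustrated with a generic sketch. The diagonal (separable) Gaussian distribution, the greedy mean update, and the step-size decay below are hypothetical stand-ins, not SF-FWA's actual mechanisms:

```python
import numpy as np

def sample_population(mean, sigma, pop_size, rng):
    """Sample candidates from a separable (diagonal) Gaussian search
    distribution. Cost is O(pop_size * d), i.e. linear in dimensionality d,
    unlike full-covariance sampling, which costs O(pop_size * d**2)."""
    return mean + sigma * rng.standard_normal((pop_size, mean.shape[0]))

def sphere(x):
    """Separable test function: f(x) = sum(x_i^2), minimized at the origin."""
    return np.sum(x ** 2, axis=-1)

rng = np.random.default_rng(0)
d, pop_size = 100, 16
mean = np.ones(d)         # current distribution mean
sigma = np.full(d, 0.3)   # per-coordinate step sizes (illustrative values)

for _ in range(200):
    pop = sample_population(mean, sigma, pop_size, rng)
    mean = pop[np.argmin(sphere(pop))]  # greedy update (not SF-FWA's rule)
    sigma *= 0.98                       # fixed decay, in place of online adaptation
```

On this separable function the per-coordinate distribution suffices; the paper's contribution lies in adapting such distributions online so the population also copes with rotated (non-separable) landscapes.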
