Abstract

As the dimensionality of an optimization problem increases, deriving a solution becomes increasingly complicated and difficult. Metaheuristic algorithms are effective methods for solving many optimization problems, but most of them perform poorly on high-dimensional problems. In this paper, a novel metaheuristic algorithm called Dynamic Stochastic Search (DSS) is proposed for high-dimensional optimization problems. To effectively carry out the exploration (diversification) and exploitation (intensification) processes of DSS, a dynamic stochastic search process is designed by defining a stochastic search control factor, a search process based on the Gaussian distribution, two shrink modes inspired by the Whale Optimization Algorithm, and a balance mechanism derived from the Bat Algorithm. The proposed algorithm has the following advantages: no specific control parameters other than the population size and the maximum number of iterations, a simple structure, and low computational cost in implementation. An analysis of the computational complexity of DSS confirms this simplicity. To evaluate the performance of DSS, twenty 300-dimensional and 3000-dimensional classical benchmark functions as well as thirty 100-dimensional CEC2014 benchmark functions are used. The statistical results demonstrate the effectiveness and feasibility of DSS for high-dimensional optimization problems: it shows better convergence performance and higher efficiency than various advanced optimization algorithms. Moreover, the superiority of DSS in solving high-dimensional real-world optimization problems is validated by applying it to feature selection on ten real-world benchmark classification datasets from the UCI machine learning repository.
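
To make the described search process concrete, the following is a minimal Python sketch of a DSS-style loop: a population is perturbed around the current best solution with Gaussian noise whose scale shrinks under an iteration-dependent control factor. The specific control-factor schedule, the Gaussian move, the shrinking step size, the sphere objective, and all parameter values are illustrative assumptions only; they are not the paper's actual DSS update equations.

```python
import numpy as np

def sphere(x):
    """Example objective (assumed placeholder): minimize the sum of squares."""
    return float(np.sum(x ** 2))

def dss_sketch(objective, dim=300, pop_size=30, max_iter=1000,
               lower=-100.0, upper=100.0, seed=0):
    """Hypothetical sketch of a dynamic stochastic search loop.

    The control factor, the Gaussian move around the best solution,
    and the shrinking step size are assumptions for illustration;
    the paper's actual DSS update rules are not reproduced here.
    """
    rng = np.random.default_rng(seed)
    pop = rng.uniform(lower, upper, size=(pop_size, dim))
    fitness = np.array([objective(x) for x in pop])
    best = pop[np.argmin(fitness)].copy()
    best_fit = float(fitness.min())

    for t in range(max_iter):
        # Assumed control factor: decreases linearly from 1 to 0,
        # shifting the search from exploration to exploitation.
        a = 1.0 - t / max_iter
        for i in range(pop_size):
            # Gaussian perturbation around the current best solution,
            # with a step size that shrinks as iterations proceed.
            step = a * 0.1 * (upper - lower)
            candidate = best + step * rng.normal(0.0, 1.0, size=dim)
            candidate = np.clip(candidate, lower, upper)
            f = objective(candidate)
            if f < fitness[i]:  # greedy replacement of the current member
                pop[i], fitness[i] = candidate, f
                if f < best_fit:
                    best, best_fit = candidate.copy(), f
    return best, best_fit

if __name__ == "__main__":
    _, val = dss_sketch(sphere, dim=300, pop_size=30, max_iter=500)
    print(f"best objective value found: {val:.4e}")
```

As in the abstract's description, this sketch exposes only the population size and the maximum number of iterations as user-facing parameters; everything else is driven by the iteration counter and random draws.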
