Abstract

This paper proposes a multi-scale search algorithm for solving global optimization problems given a finite number of function evaluations. We refer to this algorithm as Naive Multi-scale Search Optimization (NMSO). NMSO searches for the optimal solution by optimistically partitioning the search space over multiple scales in a hierarchical fashion. Based on a weak assumption about the function smoothness, we present a theoretical analysis of its finite-time and asymptotic convergence. The algorithm has been empirically assessed on the noiseless Black-Box Optimization Benchmarking (BBOB) testbed and compared with state-of-the-art optimistic and stochastic algorithms. Moreover, the efficacy of NMSO has been validated in the black-box optimization competition held at the GECCO’15 conference, where it secured third place out of twenty-eight participating algorithms. Overall, NMSO is well suited to problems with a limited budget of function evaluations, low-dimensional search spaces, and objective functions that are separable or multi-modal; otherwise, its performance is comparable with that of the top-performing algorithms.
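
The sketch below illustrates the general idea of optimistic multi-scale partitioning described above. It is not the authors' NMSO implementation: the trisection rule, the per-scale expansion criterion, and all names are simplifying assumptions made for exposition only.

```python
import numpy as np


def optimistic_multiscale_minimize(f, lower, upper, max_evals=500):
    """Minimize f over the box [lower, upper] within a fixed evaluation budget."""
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)

    def evaluate(lo, hi):
        centre = (lo + hi) / 2.0
        return float(f(centre)), centre

    # Each cell: [value at centre, depth (scale), lower corner, upper corner].
    v0, x0 = evaluate(lower, upper)
    cells = [[v0, 0, lower, upper]]
    best_v, best_x, evals = v0, x0, 1

    while evals < max_evals:
        max_depth = max(cell[1] for cell in cells)
        v_coarser = np.inf                      # best value expanded at coarser scales
        for depth in range(max_depth + 1):      # sweep scales from coarse to fine
            idxs = [i for i, cell in enumerate(cells) if cell[1] == depth]
            if not idxs:
                continue
            i_best = min(idxs, key=lambda i: cells[i][0])
            # Optimistic rule: expand this scale's best cell only if it improves
            # on every cell already expanded at a coarser scale in this sweep.
            if cells[i_best][0] >= v_coarser:
                continue
            v_coarser = cells[i_best][0]
            _, d, lo, hi = cells.pop(i_best)
            axis = int(np.argmax(hi - lo))      # trisect along the longest side
            step = (hi[axis] - lo[axis]) / 3.0
            for k in range(3):
                child_lo, child_hi = lo.copy(), hi.copy()
                child_lo[axis] = lo[axis] + k * step
                child_hi[axis] = lo[axis] + (k + 1) * step
                v, x = evaluate(child_lo, child_hi)
                evals += 1
                cells.append([v, d + 1, child_lo, child_hi])
                if v < best_v:
                    best_v, best_x = v, x
                if evals >= max_evals:
                    return best_x, best_v
    return best_x, best_v


# Example: a separable, multi-modal 2-D test function under a small budget.
rastrigin = lambda x: 10 * x.size + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))
x_best, f_best = optimistic_multiscale_minimize(
    rastrigin, lower=[-5.12, -5.12], upper=[5.12, 5.12], max_evals=300)
```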
