Abstract

Many-parameter optimization remains hard, especially when the optimization landscape is complex, rugged, and non-differentiable. The engineering of stochastic black-box optimization methods, particularly evolutionary algorithms (EAs), is the most common and successful approach to such problems, and several strategies are currently being explored to improve performance when the number of parameters is large (problems with around 1,000 parameters are now typical). Prominent among these techniques are variants of differential evolution, while one of the main algorithm-engineering strategies being explored is 'co-operative co-evolution' (CC), which involves successively optimizing subsets of the design parameters, with an organized approach occasionally reconciling these 'subspace' optimizations. Recent work has shown that combining CC with fitness inheritance (FI), a technique heretofore rarely explored in the context of large-scale optimization, can reliably lead to faster and better performance. However, that work was done in the context of a simple underlying EA (allowing more confidence that the benefits were due primarily to the combination of CC and FI). Here we explore the extent to which CC and FI provide added value when engineered together into more sophisticated, so-called state-of-the-art underlying algorithms, already equipped with a variety of additional enhancements. To that end, in this paper we explore SaNSDE and DECC-DML, two recent high-performance techniques in the field of large-scale optimization. We also explore two basic adaptive parameter-setting strategies for the FI component. We find that engineering FI (and CC, where it was not already present) into these algorithms provides either competitive or improved results.
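For readers unfamiliar with the two components, the sketch below shows one minimal way CC and FI can be combined, assuming a simple DE/rand/1 mutation and a toy sphere objective. It is an illustrative outline only, not the SaNSDE or DECC-DML machinery evaluated in the paper; the function names (cc_with_fi, sphere) and parameters (inherit_prob, group_size, etc.) are hypothetical.

```python
import numpy as np

# Hypothetical toy objective; the paper's benchmark suites are not reproduced here.
def sphere(x):
    return float(np.sum(x ** 2))

def cc_with_fi(f, dim=20, group_size=5, pop_size=10, cycles=30,
               gens_per_cycle=20, inherit_prob=0.5, seed=0):
    """Cooperative co-evolution: optimize one block ('subspace') of variables
    at a time, holding the others fixed at a shared context vector, then
    reconcile by writing each block's best back into the context.
    Fitness inheritance: with probability inherit_prob, a child's fitness is
    estimated from its parents' fitnesses instead of a true evaluation."""
    rng = np.random.default_rng(seed)
    context = rng.uniform(-5.0, 5.0, dim)          # current full solution
    best_f = f(context)
    blocks = [np.arange(s, min(s + group_size, dim))
              for s in range(0, dim, group_size)]

    def full_eval(block, sub):
        # Evaluate a sub-vector in the context of the current full solution.
        trial = context.copy()
        trial[block] = sub
        return f(trial)

    for _ in range(cycles):
        for block in blocks:
            # Sub-population over this block only; all other variables frozen.
            pop = rng.uniform(-5.0, 5.0, (pop_size, len(block)))
            fit = np.array([full_eval(block, sub) for sub in pop])
            for _ in range(gens_per_cycle):
                for i in range(pop_size):
                    # DE/rand/1-style child from three distinct parents.
                    parents = rng.choice(pop_size, 3, replace=False)
                    a, b, c = pop[parents]
                    child = np.clip(a + 0.5 * (b - c), -5.0, 5.0)
                    if rng.random() < inherit_prob:
                        child_f = fit[parents].mean()      # inherited estimate
                    else:
                        child_f = full_eval(block, child)  # true evaluation
                    if child_f < fit[i]:
                        pop[i], fit[i] = child, child_f
            # Reconcile: re-evaluate the block's best truly before accepting it.
            best_sub = pop[int(np.argmin(fit))]
            trial_f = full_eval(block, best_sub)
            if trial_f < best_f:
                context[block] = best_sub
                best_f = trial_f
    return context, best_f

x_best, f_best = cc_with_fi(sphere)
print(f"best objective value: {f_best:.4g}")
```

The cost saving in this pattern comes from inherited children skipping the objective call entirely; because inherited values are only estimates, the block's best solution is re-evaluated truly before being written back into the shared context vector.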
