Abstract
In recent years there have been independent developments in multiple branches of Evolutionary Computation (EC) that interpret population-based and model-based search algorithms in terms of information-geometric concepts. This trend has resulted in novel algorithms and in an improved understanding of existing ones. This tutorial aims to make this line of research accessible to a broader range of researchers. A statistical model, identified by a parametric family of distributions, is equipped with an intrinsic (Riemannian) geometry, the so-called information geometry. From this perspective, a statistical model is a manifold of distributions where the inner product is given by the Fisher information metric. Any evolutionary algorithm that implicitly or explicitly evolves the parameters of a search distribution defines a dynamics on this manifold. Taking into account the Riemannian geometry of the new search space given by the search distributions allows evolutionary operators to be described and analyzed in a new light. Notably, this framework can be used for the study of optimization algorithms. A core idea of several recent heuristics in both the continuous and the discrete domain, such as Estimation of Distribution Algorithms (EDAs) and Natural Evolution Strategies (NESs), is to perform stochastic gradient descent directly on the space of search distributions. However, the definition of the gradient depends on the metric, which is why it is fundamental to consider the information geometry of the space of search distributions. Despite being equivalent to classical gradient-based methods for a stochastically relaxed problem, the approach performs randomized direct search on the original search space: the generation of an offspring population, as well as selection and strategy adaptation, turns out to implicitly sample a search distribution in a statistical model and to perform a stochastic gradient step in the direction of the natural gradient. Particular strengths of the information-geometric framework are its ability to unify optimization in discrete and continuous domains, as well as the traditionally separate processes of optimization and strategy parameter adaptation. Respecting the intrinsic information geometry automatically results in powerful invariance principles. The framework can be seen as an analysis toolbox for existing methods, as well as a generic design principle for novel algorithms. This tutorial will introduce the mathematical concepts of information geometry to the EC community from scratch. It will convey not only rigorous definitions but also geometric intuition about Riemannian geometry, information geometry, the natural gradient, and stochastic gradient descent. Stochastic relaxations of EC problems will act as the glue between these concepts. The framework will be made accessible through applications to basic as well as state-of-the-art algorithms operating on discrete and continuous domains.
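To make the core idea concrete, the following is a minimal illustrative sketch, not part of the tutorial material itself, of natural-gradient search with a separable Gaussian search distribution in the spirit of NES-style algorithms. It exploits the fact that, in the (mu, log sigma) parametrization of a Gaussian, the natural gradient of the stochastically relaxed objective E[f(x)] has a closed form, so no explicit Fisher matrix needs to be assembled or inverted. All function names, population sizes, step sizes, and utility values here are illustrative assumptions, not quantities prescribed by the tutorial.

```python
import numpy as np

def natural_gradient_search(f, mu, sigma, iters=200, pop=20,
                            eta_mu=1.0, eta_sigma=0.1, seed=0):
    """Minimize f via natural-gradient descent on a separable Gaussian
    search distribution N(mu, diag(sigma^2)) -- a sketch of the NES idea."""
    rng = np.random.default_rng(seed)
    n = mu.size
    for _ in range(iters):
        z = rng.standard_normal((pop, n))        # standard-normal samples
        x = mu + sigma * z                       # offspring drawn from the search distribution
        order = np.argsort([f(xi) for xi in x])  # rank offspring, best (lowest f) first
        u = np.zeros(pop)
        u[order] = np.linspace(1.0, -1.0, pop) / pop  # rank-based utilities, best gets largest
        # Closed-form natural gradients in (mu, log sigma) coordinates:
        grad_mu = u @ z                          # natural gradient w.r.t. mu is sigma * grad_mu
        grad_log_sigma = u @ (z**2 - 1.0)        # natural gradient w.r.t. log sigma
        mu = mu + eta_mu * sigma * grad_mu       # step along the manifold's natural gradient
        sigma = sigma * np.exp(0.5 * eta_sigma * grad_log_sigma)
    return mu, sigma

# Usage: minimize the sphere function f(x) = ||x||^2 in 5 dimensions.
sphere = lambda x: float(np.dot(x, x))
mu, sigma = natural_gradient_search(sphere, mu=np.full(5, 3.0), sigma=np.full(5, 1.0))
print(mu)  # approaches the optimum at the origin
```

Because only the ranks of the offspring enter the update, the sketch also illustrates the invariance principles mentioned above: the search path is unchanged under strictly monotone transformations of f.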