To date, a variety of algorithmic methods have been developed to find 'good' solutions (solutions near the optimum) for global optimization problems that cannot be solved analytically. In particular, methods such as differential evolution, evolutionary algorithms, and hill climbing belong to the class of Stochastic Global Optimization Algorithms (SGoals). In general, an SGoal iteratively applies stochastic operations to a set of candidate solutions (the population), in some cases guided by a heuristic or metaheuristic. Although some research has attempted to formalize SGoals using Markov kernels, such formalizations are not general and are sometimes imprecise. In this paper, we propose a comprehensive, systematic, and formal approach for studying SGoals. First, we present the probability-theoretic concepts required for this formalization and show that the swapping, sorting, and permutation functions, among others, can be represented by Markov kernels. Then, we introduce the joint Markov kernel as a way of characterizing the combination of stochastic methods. Next, we define the optimization space, a σ-algebra that contains the ϵ-optimal states, and develop Markov kernels for stochastic methods such as swapping, sorting, and permutation. Finally, we establish sufficient convergence conditions for SGoals and express some popular SGoals in terms of the developed theory.
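As a brief illustration of the kernel viewpoint sketched above (the notation Ω, Σ, K is our own and not necessarily that of the paper): a deterministic operation f on a measurable space (Ω, Σ), such as a swap or a sort applied to a population, corresponds to the Dirac kernel

K_f(x, A) = \delta_{f(x)}(A) = \mathbf{1}_A\big(f(x)\big), \qquad x \in \Omega,\ A \in \Sigma,

and two stochastic steps applied in sequence are described by the composed kernel

(K_1 \circ K_2)(x, A) = \int_{\Omega} K_2(y, A)\, K_1(x, \mathrm{d}y),

which is the standard mechanism behind combining stochastic methods into a single Markov kernel.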