Abstract
Estimation of Distribution Algorithms (EDAs) maintain and iteratively update a probabilistic model to tackle optimization problems. The Boltzmann probability distribution function (Boltzmann-PDF) offers advantages when used in energy-based EDAs. However, direct sampling from the Boltzmann-PDF to update the probabilistic model is impractical, so several EDAs approximate the Boltzmann-PDF with a Gaussian distribution, usually derived by minimizing the Kullback-Leibler divergence (KL-divergence) between the Gaussian and the Boltzmann-PDF. The KL-divergence is not symmetric, which causes the Gaussian approximation to model the target function incorrectly, because the parameters of the Gaussian are not optimally estimated. In this paper, we derive an approximation to the Boltzmann-PDF using Jeffreys' divergence (a symmetric measure) in lieu of the KL-divergence, and thus improve the performance of the optimization algorithm. Our approach is termed the Symmetric-approximation Energy-based Estimation of Distribution (SEED) algorithm. The SEED algorithm is experimentally compared, under a univariate approach, against two other EDAs (UMDAc and BUMDA) on several benchmark optimization problems. The results show that the SEED algorithm is more effective and more efficient than the other algorithms.
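The asymmetry of the KL-divergence mentioned above can be seen directly from the standard closed-form expression for the KL-divergence between two univariate Gaussians; Jeffreys' divergence is its symmetrized sum. The sketch below (a generic illustration, not the paper's derivation, and using hypothetical function names) makes the contrast concrete:

```python
import math

def kl_gauss(mu1, s1, mu2, s2):
    """Closed-form KL(N(mu1, s1^2) || N(mu2, s2^2)) for univariate Gaussians."""
    return math.log(s2 / s1) + (s1 ** 2 + (mu1 - mu2) ** 2) / (2 * s2 ** 2) - 0.5

def j_gauss(mu1, s1, mu2, s2):
    """Jeffreys' divergence: the symmetrized KL, J(p, q) = KL(p||q) + KL(q||p)."""
    return kl_gauss(mu1, s1, mu2, s2) + kl_gauss(mu2, s2, mu1, s1)

# KL depends on the order of its arguments, so the Gaussian minimizing
# KL(Gaussian || target) generally differs from the one minimizing
# KL(target || Gaussian); Jeffreys' divergence removes this ambiguity.
forward = kl_gauss(0.0, 1.0, 1.0, 2.0)   # KL(p || q)
reverse = kl_gauss(1.0, 2.0, 0.0, 1.0)   # KL(q || p), a different value
symmetric = j_gauss(0.0, 1.0, 1.0, 2.0)  # equals j_gauss(1.0, 2.0, 0.0, 1.0)
```

Minimizing the symmetric measure is what motivates the SEED parameter estimates in the paper.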
Highlights
Computational optimization refers to methods for the selection of a best solution to a given problem that has been mathematically modeled
An optimization algorithm requires a mechanism for producing increasingly better candidate solutions, guided by a so-called objective function that quantifies the merit of each possible solution
A large variety of optimization problems exists [1], [2], which can be divided into categories such as: discrete-variable, continuous-variable, and combinatorial optimization problems [3], multi-objective optimization problems [4], constraint satisfaction problems, nonlinear-programming problems [5], etc.
Summary
Computational optimization refers to methods for selecting a best solution to a given problem that has been mathematically modeled. A particular type of evolutionary algorithm is the Estimation of Distribution Algorithm (EDA). These metaheuristics build explicit probabilistic models that are iteratively refined to produce increasingly better solutions for a target problem. The general idea is to determine the Gaussian parameters (μ, ν) that minimize the J-divergence (4) and use that distribution as the probabilistic model in a modified EDA, which we call the Symmetric-approximation Energy-based Estimation of Distribution (SEED) algorithm. As a second contribution, we derive a minimum-variance estimator for β by minimizing the variance of the Gaussian distribution with respect to this parameter. This way of addressing the problem results in a self-adaptive mechanism to compute β for each particular fitness function.
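The iterative model-refinement loop that the summary describes can be sketched as a minimal univariate Gaussian EDA. This is only a generic truncation-selection loop (closer in spirit to UMDAc than to SEED, whose J-divergence-derived updates and self-adaptive β are not reproduced here); all function and parameter names are illustrative assumptions:

```python
import random
import statistics

def sphere(x):
    """Toy fitness function to minimize; the optimum is at x = 0."""
    return x * x

def univariate_gaussian_eda(f, mu=5.0, sigma=3.0, pop=50, elite=10,
                            iters=60, seed=0):
    """Generic univariate Gaussian EDA with truncation selection.

    At each iteration: sample the population from N(mu, sigma^2),
    keep the `elite` best individuals, and refit mu and sigma to them.
    """
    rng = random.Random(seed)
    for _ in range(iters):
        xs = sorted((rng.gauss(mu, sigma) for _ in range(pop)), key=f)
        best = xs[:elite]
        mu = statistics.fmean(best)
        sigma = max(statistics.pstdev(best), 1e-12)  # avoid total collapse
    return mu

best = univariate_gaussian_eda(sphere)  # converges toward the optimum at 0
```

SEED replaces the plain refit step with parameters obtained by minimizing Jeffreys' divergence to the Boltzmann-PDF, which is what the paper's comparison against UMDAc and BUMDA evaluates.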