Abstract

Several new initialization techniques have recently been proposed. Although they perform well in very low dimensions, their performance deteriorates significantly as the number of dimensions grows. This paper introduces a new method for initializing metaheuristic algorithms based on Gibbs sampling. The proposed method uses the Gibbs approach to sample a multidimensional Gaussian that covers the entire search space defined by the objective function. Each decision variable is sampled sequentially, so that its value depends only on the variable sampled before it. This procedure generates initial positions with very low mutual correlation, which prevents the initial solutions from clustering in specific regions of the search space, a problem that becomes especially pronounced as the number of dimensions increases. To assess its effectiveness, the method was applied in conjunction with a Differential Evolution algorithm and evaluated on a selection of relevant and challenging benchmark functions. The experiments show that the algorithm produces a superior set of initial solutions, allowing the global solution to be found consistently even as the problem becomes more complex with additional dimensions.
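
The abstract does not give implementation details, but the initialization it describes can be illustrated with a minimal sketch: Gibbs sampling of a multivariate Gaussian whose marginals span the search bounds, producing an initial population that a Differential Evolution algorithm could then evolve. The function name `gibbs_init`, the diagonal covariance, the burn-in length, and the choice of standard deviation (so that roughly ±2σ covers each dimension) are all illustrative assumptions, not the paper's actual settings.

```python
import numpy as np

def gibbs_init(bounds, pop_size, burn_in=50, rng=None):
    """Draw an initial population by Gibbs-sampling a multivariate Gaussian
    whose marginals span the search space defined by `bounds`.

    bounds : array of shape (d, 2) with per-dimension (lower, upper) limits.
    Returns an array of shape (pop_size, d), clipped to the bounds.
    """
    rng = np.random.default_rng(rng)
    bounds = np.asarray(bounds, dtype=float)
    lo, hi = bounds[:, 0], bounds[:, 1]
    d = len(lo)

    # Gaussian centred on the search space; std chosen so ~±2 sigma covers it
    # (illustrative assumption, not specified in the abstract).
    mu = (lo + hi) / 2.0
    sigma = (hi - lo) / 4.0
    cov = np.diag(sigma ** 2)  # assumed diagonal covariance for the sketch

    x = mu.copy()              # current state of the Gibbs chain
    samples = []
    for it in range(burn_in + pop_size):
        for i in range(d):     # sweep over decision variables in sequence
            # Conditional of x_i given the other coordinates of a Gaussian;
            # with a diagonal covariance this reduces to N(mu_i, sigma_i^2).
            others = [j for j in range(d) if j != i]
            s12 = cov[i, others]
            s22_inv = np.linalg.inv(cov[np.ix_(others, others)])
            cond_mean = mu[i] + s12 @ s22_inv @ (x[others] - mu[others])
            cond_var = cov[i, i] - s12 @ s22_inv @ s12
            x[i] = rng.normal(cond_mean, np.sqrt(cond_var))
        if it >= burn_in:      # keep only post-burn-in states as individuals
            samples.append(x.copy())

    return np.clip(np.array(samples), lo, hi)

# Example: population of 50 individuals in a 30-dimensional space on [-100, 100].
pop = gibbs_init(np.tile([-100.0, 100.0], (30, 1)), pop_size=50)
print(pop.shape)  # (50, 30)
```

The returned array would simply replace the uniform random population that a standard Differential Evolution implementation builds in its first step; the rest of the DE loop is unchanged.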
