Objective: The objective is to present a new method for exploring BFGS in which a new decision-making flow was added in order to optimize and computationally improve the performance of the Genetic Algorithm by controlling the value of the gradient as the search evolves.

Theoretical Framework: In the literature, the work of [Chen, 2014] stands out; it proposed that there must be a repulsive contribution to ensure the stability of the structure, introduced as the interaction between atomic pairs that makes up the second term of the cohesion energy. [Goldberg, 1989] proposed the use of a genetic algorithm to search for the available minima of the potential energy function associated with clusters.

Method: The methodology adopted in this research was the use of Evolutionary Algorithms (EAs), which are inspired by the principle of natural selection and survival of the fittest, proposed by the naturalist Charles Darwin and described in his book "On the Origin of Species" (1859). Among the various selection methods, the roulette wheel method was used, as it is useful in problems where diversity needs to be maintained to avoid premature convergence. The crossover operator is the one used by Deaven and Ho, a technique known as the cut-and-splice method, applied in the context of optimizing molecular structures and atomic clusters. Aluminum and magnesium clusters were explored with a Genetic Algorithm applied to the Gupta potential function and tuned to search for the lowest-energy minima for each cluster size (illustrative sketches of the potential and of these operators are given after the abstract).

Results and Discussion: The results obtained show that the Genetic Algorithm with the new decision-making flow reached better average energies, a much smaller standard deviation, and better global minima than the sequential approach, at a lower computational cost. In the discussion section, these results are contextualized in light of the theoretical framework, highlighting the implications and relationships identified. Possible discrepancies and limitations of the study are also considered in this section.

Research Implications: This article presented a new method for exploring BFGS in which a new decision-making flow was added in order to optimize and computationally improve the performance of the Genetic Algorithm by controlling the value of the gradient as the search evolves. The new method proved very promising according to the results presented in the previous section: the Sequential Genetic Algorithm, which becomes quite costly because it invokes the BFGS optimization method, always demands a very small gradient-vector error, whereas in the proposed Genetic Algorithm this value starts very low and is gradually increased according to established criteria. In experiments carried out with both 10 and 30 seeds, the Genetic Algorithm obtained better energies on average, its standard deviation was also much better, and it reached better global minima.

Originality/Value: This study contributes to the literature by creating new flows in the Genetic Algorithm, enabling computational gains in the processing of the experiments, as demonstrated by the results obtained.
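For orientation, the cohesion energy discussed in the Theoretical Framework is consistent with the generic second-moment (Gupta/RGL) form commonly used for metallic clusters. The abstract does not state the parameterization used for aluminum and magnesium, so the expression below is only the standard form, with A, xi, p, q and r_0 as element-dependent parameters and r_ij the distance between atoms i and j:

    E_{\mathrm{coh}} = \sum_{i=1}^{N} \left[ -\sqrt{\sum_{j \ne i} \xi^{2}\, e^{-2q\,(r_{ij}/r_{0} - 1)}} \;+\; \sum_{j \ne i} A\, e^{-p\,(r_{ij}/r_{0} - 1)} \right]

In this form the square-root term is the many-body cohesive (band) contribution, and the second, pairwise exponential term is the repulsive contribution mentioned in the Theoretical Framework.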
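A minimal sketch of the two genetic operators named in the Method section, roulette wheel selection and Deaven-Ho cut-and-splice crossover. The function names, the fitness-to-probability mapping, and the choice of cutting plane are illustrative assumptions, since the abstract does not specify them.

    import numpy as np

    def roulette_select(population, fitness, rng):
        """Pick one parent with probability proportional to (shifted) fitness.

        Assumes higher fitness is better; the shift keeps the weights
        non-negative when raw values (e.g. negative cluster energies) are used.
        """
        f = np.asarray(fitness, dtype=float)
        weights = f - f.min() + 1e-12
        probs = weights / weights.sum()
        idx = rng.choice(len(population), p=probs)
        return population[idx]

    def cut_and_splice(parent_a, parent_b, rng):
        """Deaven-Ho style crossover for clusters of Cartesian coordinates (N, 3).

        Both parents are centred, cut by a common random plane through the
        origin, and one half of parent A is spliced to the complementary
        half of parent B.
        """
        a = parent_a - parent_a.mean(axis=0)
        b = parent_b - parent_b.mean(axis=0)
        normal = rng.normal(size=3)
        normal /= np.linalg.norm(normal)      # random cutting-plane normal
        above = a[a @ normal >= 0.0]          # atoms of A above the plane
        below = b[b @ normal < 0.0]           # atoms of B below the plane
        return np.vstack([above, below])      # may need atom-count repair in practice

A real implementation would shift the cutting plane (or reject the cut) so the child keeps exactly N atoms; that repair step is omitted here. The rng argument is assumed to be a numpy Generator, e.g. np.random.default_rng(seed).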
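The decision-making flow described in the Objective and Research Implications amounts, per the abstract, to a gradient-vector tolerance for the BFGS local optimizer that is not held at one very small value (as in the sequential version) but starts low and is adjusted across generations according to established criteria. The sketch below uses scipy.optimize.minimize with method='BFGS' and its gtol option; the update schedule (direction, rate, trigger), the helper names, and the energy callable are assumptions for illustration only, since the abstract does not give the actual criteria.

    import numpy as np
    from scipy.optimize import minimize

    def relax_cluster(energy_fn, coords, gtol):
        """Local BFGS relaxation; gtol bounds the gradient-vector norm.

        energy_fn is assumed to take a flat coordinate array and return the
        Gupta energy of the cluster.
        """
        res = minimize(energy_fn, coords.ravel(), method="BFGS",
                       options={"gtol": gtol})
        return res.x.reshape(coords.shape), res.fun

    def evolve(energy_fn, initial_population, generations,
               gtol_start=1e-6, gtol_cap=1e-3, growth=2.0):
        """GA outer loop with a gradient tolerance updated as generations pass.

        The multiplicative schedule below is only a placeholder for the
        'established criteria' mentioned in the abstract.
        """
        population = list(initial_population)
        gtol = gtol_start
        best_coords, best_energy = None, np.inf
        for gen in range(generations):
            relaxed = []
            for coords in population:
                x, e = relax_cluster(energy_fn, coords, gtol)
                relaxed.append(x)
                if e < best_energy:
                    best_coords, best_energy = x, e
            # roulette wheel selection and cut-and-splice crossover would
            # build the next generation here (see the previous sketch)
            population = relaxed
            gtol = min(gtol * growth, gtol_cap)   # placeholder update criterion
        return best_coords, best_energy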