Abstract

The development of diverse methods for making optimization more efficient and effective marks a significant advance in the field of evolutionary optimization. This abstract examines four well-known approaches: Neuroevolution of Augmenting Topologies (NEAT), Genetic Algorithms (GAs), Genetic Programming (GP), and the Advanced Neuroevolutionary Genetic Algorithm (ANGA), with attention to key performance indicators: Fitness Metrics, Generalization, Efficiency and Speed, and Overall Performance. NEAT, a neuroevolutionary approach, shows strong results on competitive tasks, scoring 90% in Fitness Metrics and 88% in Generalization, but lags in Efficiency and Speed at 80%. GAs, known for their population-based search, excel in Efficiency and Speed (90%) while scoring slightly lower in Fitness Metrics and Generalization (89% and 85%, respectively). GP, which focuses on evolving computer programs, achieves comparable scores in Fitness Metrics, Generalization, and Efficiency and Speed (88%, 85%, and 85%, respectively). The proposed ANGA algorithm emerges as the top performer across all tests, attaining 93% in Fitness Metrics, 94% in Generalization, and 93% in Efficiency and Speed, demonstrating its capacity for well-rounded optimization. Its Overall Performance score of 97.78% reflects this balance, positioning ANGA as a promising method for genetic optimization.
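For readers unfamiliar with the population-based search that the abstract attributes to GAs (and that NEAT and ANGA extend with evolving network structure), the following is a minimal illustrative sketch of a generational genetic algorithm. The toy one-max objective, operators, and parameter values are assumptions chosen for demonstration and are not taken from the paper.

```python
# Minimal sketch of a generational genetic algorithm (population-based search).
# The one-max objective and all parameters below are illustrative assumptions,
# not the configuration used in the study.
import random

POP_SIZE, GENOME_LEN, GENERATIONS, MUTATION_RATE = 50, 20, 100, 0.02

def fitness(genome):
    # Toy objective: maximize the number of 1-bits (one-max).
    return sum(genome)

def tournament(pop, k=3):
    # Select the fittest of k randomly sampled individuals.
    return max(random.sample(pop, k), key=fitness)

def crossover(a, b):
    # Single-point crossover of two parent genomes.
    point = random.randrange(1, GENOME_LEN)
    return a[:point] + b[point:]

def mutate(genome):
    # Flip each bit with a small probability.
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    population = [mutate(crossover(tournament(population), tournament(population)))
                  for _ in range(POP_SIZE)]

best = max(population, key=fitness)
print(f"Best fitness after {GENERATIONS} generations: {fitness(best)}/{GENOME_LEN}")
```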
