Abstract

Recent growth of large databases, especially those in genetics and proteomics, is pushing the development of novel computational algorithms that implement rapid and accurate search strategies. One successful approach has been to use artificial intelligence methods, including pattern recognition (e.g. neural networks) and optimization techniques (e.g. genetic algorithms). The focus of this paper is on optimizing the design of genetic algorithms by using an adaptive mutation rate that is derived from comparing the fitness values of successive generations. We propose a novel pseudoderivative-based mutation rate operator designed to allow a genetic algorithm to escape local optima and successfully continue to the global optimum. Once proven successful, this algorithm can be implemented to solve real problems in neurology and bioinformatics. As a first step towards this goal, we tested our algorithm on two 3-dimensional surfaces with multiple local optima but only one global optimum, as well as on the N-queens problem, an applied problem in which the function that maps the curve is implicit. For all tests, the adaptive mutation rate allowed the genetic algorithm to find the globally optimal solution, performing significantly better than other search methods, including genetic algorithms that implement fixed mutation rates.
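To make the proposed idea concrete, the following Python sketch adapts a GA's mutation rate from a pseudo-derivative of the best fitness across successive generations. It is an illustrative sketch only, assuming a bit-string encoding and a OneMax objective: the rate bounds, the doubling/decay constants, and the function names are demonstration choices, not the operator or test problems defined in the paper.

```python
import random

# Illustrative sketch (not the paper's exact operator): the mutation rate is
# adapted from a pseudo-derivative of the best fitness between generations.
# Bounds and scaling constants below are assumptions chosen for demonstration.

def adapt_mutation_rate(rate, prev_best, curr_best,
                        min_rate=0.01, max_rate=0.5):
    """Raise the rate when fitness stalls (likely local optimum), lower it
    when fitness is still improving."""
    pseudo_derivative = curr_best - prev_best  # change over one generation
    if abs(pseudo_derivative) < 1e-6:          # stalled: try to escape
        rate = min(max_rate, rate * 2.0)
    else:                                      # improving: refine the search
        rate = max(min_rate, rate * 0.9)
    return rate

def mutate(individual, rate):
    """Flip each gene independently with probability `rate`."""
    return [1 - g if random.random() < rate else g for g in individual]

def genetic_algorithm(fitness, n_bits=20, pop_size=50, generations=200):
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    rate, prev_best = 0.05, float("-inf")
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        curr_best = fitness(pop[0])
        rate = adapt_mutation_rate(rate, prev_best, curr_best)
        prev_best = curr_best
        # Keep the better half and refill with mutated copies of it
        # (crossover omitted to keep the sketch short).
        parents = pop[: pop_size // 2]
        pop = parents + [mutate(random.choice(parents), rate)
                         for _ in range(pop_size - len(parents))]
    return max(pop, key=fitness)

# Example: maximize the number of 1-bits (OneMax).
best = genetic_algorithm(lambda ind: sum(ind))
```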

Highlights

  • The last few years have seen an exponential increase in the size of databases, especially those in genetics, which catalog the basis of various diseases

  • The challenge in solving a global optimization problem is in seeking the global optimum rather than becoming trapped in a local optimum, an issue that will be addressed in more detail later[6]

  • A genetic algorithm (GA) with less randomness converges more quickly towards local optima; by limiting randomness it restricts the search space, which in turn hinders the search for the global optimum (see the sketch after this list)
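A minimal sketch of the trade-off noted in the last highlight, assuming a bit-string encoding: the per-gene mutation rate determines how far offspring move from their parent, so a low rate confines the search to a small neighbourhood while a higher rate reaches more of the search space. The rates and sample size below are illustrative.

```python
import random

def mutate(parent, rate):
    # Flip each gene independently with probability `rate`.
    return [1 - g if random.random() < rate else g for g in parent]

def mean_hamming_distance(parent, rate, samples=1000):
    """Average number of genes changed per offspring at a given rate."""
    total = sum(sum(p != c for p, c in zip(parent, mutate(parent, rate)))
                for _ in range(samples))
    return total / samples

parent = [0] * 50
for rate in (0.005, 0.05, 0.5):
    # Low rates keep offspring in a small neighbourhood (fast convergence,
    # easily trapped); higher rates explore more but converge more slowly.
    print(rate, mean_hamming_distance(parent, rate))
```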



Introduction

The last few years have seen an exponential increase in the size of databases, especially those in genetics, which catalog the basis of various diseases. In a genetic algorithm, small mutations add an element of randomness and in this way aid the search for the optimal solution[2]. Within a given search space S, a global (absolute) optimum of the optimization function f is sought; this may take the form of a global maximum or minimum, depending on the original problem. The challenge in solving a global optimization problem is to find the global optimum rather than becoming trapped in a local optimum, an issue that will be addressed in more detail later[6]. These optimization problems can be approached with a variety of techniques. One popular technique is the use of genetic algorithms (GAs), the focus of this study[7].
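As a simple illustration of the trapping problem described above, consider a hypothetical one-dimensional stand-in for the multimodal test surfaces: a function on S = [0, 10] with several local maxima and a single global maximum. The function f and the hill-climbing baseline below are illustrative assumptions, not the surfaces or methods used in the paper; a purely local search started near a minor peak stops there, which is exactly the failure mode the adaptive mutation rate is meant to avoid.

```python
import math

# Hypothetical 1-D stand-in for a multimodal surface: sin(3x) gives several
# local peaks on S = [0, 10]; the Gaussian bump centred at x = 7.5 creates a
# single global maximum (near x ~ 7.0).
def f(x):
    return math.sin(3.0 * x) + 2.0 * math.exp(-((x - 7.5) ** 2))

def hill_climb(x, step=0.01, lo=0.0, hi=10.0):
    """Move uphill in small steps until no neighbour improves f."""
    while True:
        candidates = [max(lo, x - step), x, min(hi, x + step)]
        best = max(candidates, key=f)
        if best == x:
            return x
        x = best

local_x = hill_climb(1.0)        # trapped on a nearby local peak (~0.52)
grid = [i * 0.001 for i in range(10001)]
global_x = max(grid, key=f)      # brute-force scan finds the global optimum
print(local_x, f(local_x), global_x, f(global_x))
```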

