Abstract

One of the most difficult challenges in human genetics is the identification and characterization of susceptibility genes for common complex human diseases. The gene-gene and gene-environment interactions that comprise the genetic architecture of these diseases present a substantial statistical challenge. As the field pushes toward genome-wide association studies with hundreds of thousands, or even millions, of variables, the development of novel statistical and computational methods is a necessity. Previously, we introduced a grammatical evolution optimized neural network (GENN) to improve upon the trial-and-error process of choosing an optimal architecture for a pure feed-forward, back-propagation neural network. GENN optimizes the inputs from a large pool of variables, the weights, and the connectivity of the network, including the number of hidden layers and the number of nodes in each hidden layer. Thus, the algorithm automatically generates an optimal neural network architecture for a given data set. Like all evolutionary computing algorithms, grammatical evolution relies on evolutionary operators such as crossover and selection to learn the best solution for a given data set. In this study, we investigate the effect of fitness-proportionate versus ordinal selection schemes, and of standard and novel crossover strategies, on the performance of GENN.
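The distinction between the two selection schemes compared here can be illustrated with a minimal sketch. This is not the GENN implementation; it is a generic illustration, assuming a maximization problem where each individual has a nonnegative fitness. Fitness-proportionate (roulette-wheel) selection picks individuals with probability proportional to raw fitness, so it is sensitive to the magnitude of fitness differences; ordinal (rank-based) selection depends only on the ordering of fitnesses.

```python
import random

def fitness_proportionate(population, fitnesses, rng):
    """Roulette-wheel selection: an individual's selection probability is
    proportional to its raw fitness value."""
    total = sum(fitnesses)
    pick = rng.uniform(0.0, total)
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit
        if running >= pick:
            return individual
    return population[-1]  # guard against floating-point round-off

def rank_selection(population, fitnesses, rng):
    """Ordinal selection: selection probability depends only on fitness rank,
    not on how large the fitness differences are (linear ranking here)."""
    ranked = sorted(zip(population, fitnesses), key=lambda pair: pair[1])
    n = len(ranked)
    # individual at rank i (1 = worst, n = best) gets weight i
    total = n * (n + 1) / 2
    pick = rng.uniform(0.0, total)
    running = 0.0
    for i, (individual, _) in enumerate(ranked, start=1):
        running += i
        if running >= pick:
            return individual
    return ranked[-1][0]
```

Under roulette-wheel selection, one individual with a far larger fitness can dominate the next generation, whereas linear ranking caps its advantage at the weight of the top rank; this difference in selection pressure is the kind of behavior the comparison in the paper probes.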
