Abstract

This report describes a GA (Genetic Algorithm) method that evolves multi-layered feedforward neural network architectures for specific mappings. Each network is represented as a genotype with six kinds of genes: the learning rate, the slope of the sigmoid function, the coefficient of the momentum term, the range used to initialize the weights, the number of layers, and the number of units in each layer. Genetic operators act on populations of these genotypes to produce adaptive networks with higher fitness values. We define three kinds of fitness functions that evaluate the networks generated by the GA method; each fitness is assessed from several performance measures of the generated network after training with the BP (Back Propagation) algorithm. In our experiments, we train the networks on the XOR mapping. Using the GA method, the networks are designed systematically and easily. The generated networks require fewer training cycles than previously used networks, and the rate of convergence is improved.
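The sketch below illustrates the kind of genotype-plus-GA loop the abstract describes: a genotype carrying the six gene kinds, a BP training routine on XOR whose error feeds a fitness value, and a simple evolutionary loop. The class and function names, the particular fitness formula, and the mutation scheme are illustrative assumptions; the abstract's three fitness functions and genetic operators are not specified here, so this is a minimal sketch rather than the authors' implementation.

```python
# Minimal sketch of evolving network genotypes for the XOR mapping (assumed details).
import random
import numpy as np

class Genotype:
    """Six gene kinds named in the abstract: learning rate, sigmoid slope,
    momentum coefficient, weight-initialization range, number of hidden
    layers, and units per layer (ranges here are assumptions)."""
    def __init__(self):
        self.learning_rate = random.uniform(0.05, 1.0)
        self.sigmoid_slope = random.uniform(0.5, 2.0)
        self.momentum = random.uniform(0.0, 0.9)
        self.init_range = random.uniform(0.1, 1.0)
        self.n_hidden_layers = random.randint(1, 2)
        self.units = [random.randint(2, 5) for _ in range(self.n_hidden_layers)]

def sigmoid(x, slope):
    return 1.0 / (1.0 + np.exp(-slope * x))

def train_xor(g, epochs=500):
    """Train a network defined by genotype g with back-propagation (including
    a momentum term) on XOR; return the final mean squared error."""
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)
    sizes = [2] + g.units + [1]
    W = [np.random.uniform(-g.init_range, g.init_range, (sizes[i] + 1, sizes[i + 1]))
         for i in range(len(sizes) - 1)]            # extra row holds the bias weights
    dW_prev = [np.zeros_like(w) for w in W]
    for _ in range(epochs):
        # forward pass, keeping every layer's activations
        acts = [X]
        for w in W:
            a = sigmoid(np.hstack([acts[-1], np.ones((len(X), 1))]) @ w, g.sigmoid_slope)
            acts.append(a)
        err = T - acts[-1]
        # backward pass with momentum
        delta = err * g.sigmoid_slope * acts[-1] * (1 - acts[-1])
        for i in reversed(range(len(W))):
            a_in = np.hstack([acts[i], np.ones((len(X), 1))])
            dW = g.learning_rate * a_in.T @ delta + g.momentum * dW_prev[i]
            if i > 0:
                delta = (delta @ W[i][:-1].T) * g.sigmoid_slope * acts[i] * (1 - acts[i])
            W[i] += dW
            dW_prev[i] = dW
    return float(np.mean(err ** 2))

def fitness(g):
    # One possible fitness: reward low XOR training error (assumed form; the
    # report defines three fitness functions that are not reproduced here).
    return 1.0 / (1e-6 + train_xor(g))

def evolve(pop_size=10, generations=5):
    pop = [Genotype() for _ in range(pop_size)]
    for _ in range(generations):
        survivors = sorted(pop, key=fitness, reverse=True)[:pop_size // 2]
        # crude mutation-like operator: keep a survivor's architecture genes,
        # resample the training-parameter genes
        children = []
        for _ in range(pop_size - len(survivors)):
            parent = random.choice(survivors)
            child = Genotype()
            child.n_hidden_layers = parent.n_hidden_layers
            child.units = list(parent.units)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("best genotype:", best.__dict__)
```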
