Abstract

The multilayer perceptron (MLP) has many classification and regression applications across fields such as pattern recognition, speech, and general classification problems. However, the choice of architecture, in particular the type of activation function used by each neuron, has a great impact on convergence and performance. In this article, the authors introduce a new approach to optimizing the selection of the network architecture, weights, and activation functions. They solve the resulting model with a genetic algorithm and train the network with the back-propagation method. Numerical results show the effectiveness of the approach and the advantages of the new model over previous models in the literature.
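The abstract's core idea, using a genetic algorithm to choose per-neuron activation functions while back-propagation fits the weights, can be sketched as follows. This is a minimal illustration of the general technique, not the authors' actual model: the XOR dataset, the one-hidden-layer network, the truncation selection, and all hyperparameters (population size, mutation rate, learning rate) are assumptions chosen only to keep the example self-contained.

```python
import math
import random

# Candidate activations: (function, derivative expressed via the output value).
ACTS = {
    "tanh": (math.tanh, lambda y: 1.0 - y * y),
    "relu": (lambda x: max(0.0, x), lambda y: 1.0 if y > 0 else 0.0),
    "sigmoid": (lambda x: 1.0 / (1.0 + math.exp(-x)), lambda y: y * (1.0 - y)),
}

# Toy task (assumed for illustration): XOR.
DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]


def train_and_score(genome, hidden=4, epochs=200, lr=0.5):
    """Train a 2-hidden-4-output-1 MLP whose j-th hidden neuron uses
    activation genome[j]; return the final mean squared error (the GA
    fitness, lower is better)."""
    rng = random.Random(42)  # fixed init so fitness depends only on the genome
    w1 = [[rng.uniform(-1, 1) for _ in range(2)] for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-1, 1) for _ in range(hidden)]
    b2 = 0.0
    for _ in range(epochs):
        for x, t in DATA:
            # Forward pass.
            h = []
            for j in range(hidden):
                z = w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j]
                h.append(ACTS[genome[j]][0](z))
            y = 1.0 / (1.0 + math.exp(-(sum(w2[j] * h[j] for j in range(hidden)) + b2)))
            # Backward pass (squared error, sigmoid output).
            dy = (y - t) * y * (1.0 - y)
            for j in range(hidden):
                dh = dy * w2[j] * ACTS[genome[j]][1](h[j])  # uses pre-update w2
                w2[j] -= lr * dy * h[j]
                w1[j][0] -= lr * dh * x[0]
                w1[j][1] -= lr * dh * x[1]
                b1[j] -= lr * dh
            b2 -= lr * dy
    err = 0.0
    for x, t in DATA:
        h = [ACTS[genome[j]][0](w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j])
             for j in range(hidden)]
        y = 1.0 / (1.0 + math.exp(-(sum(w2[j] * h[j] for j in range(hidden)) + b2)))
        err += (y - t) ** 2
    return err / len(DATA)


def evolve(pop_size=8, gens=5, hidden=4, mut_rate=0.2, seed=1):
    """Genetic algorithm over activation-function assignments: truncation
    selection, one-point crossover, per-gene mutation."""
    names = list(ACTS)
    rng = random.Random(seed)
    pop = [[rng.choice(names) for _ in range(hidden)] for _ in range(pop_size)]
    for _ in range(gens):
        ranked = sorted(pop, key=train_and_score)
        parents = ranked[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, hidden)            # one-point crossover
            child = a[:cut] + b[cut:]
            child = [rng.choice(names) if rng.random() < mut_rate else g
                     for g in child]
            children.append(child)
        pop = parents + children
    return min(pop, key=train_and_score)


if __name__ == "__main__":
    best = evolve()
    print(best, train_and_score(best))
```

In this sketch the GA searches only the discrete activation assignment while back-propagation handles the continuous weights, which mirrors the hybrid scheme the abstract describes; the paper's actual model also evolves the architecture and weights themselves.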
