Abstract

Artificial neural networks have been successfully used in the areas of speech recognition, computer vision and nonlinear function approximation. However, one of the essential problems with existing neural networks is model selection, that is, choosing an adequate size for a neural network model to learn a task without compromising its performance. This paper outlines a biologically and evolutionarily plausible iterative scheme to overcome the model selection problem for a newly proposed modular neural network architecture, the modified hierarchical mixture of experts model. The proposed scheme is constructive in nature and employs embryo-genetic principles to iteratively grow a modular neural network of adequate size for the problem at hand. The effectiveness of the proposed iterative scheme is demonstrated by applying it to a benchmark classification problem.
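
For orientation, the sketch below illustrates the general mixture-of-experts idea that the hierarchical model builds on: several expert networks whose outputs are blended by a softmax gating network, plus a hypothetical grow() step that adds experts constructively. It is a minimal illustration in NumPy under assumed names (LinearExpert, MixtureOfExperts, grow), not the paper's modified hierarchical architecture or its embryo-genetic growth procedure.

```python
# Minimal, illustrative mixture-of-experts forward pass (assumption: this is
# the generic flat MoE formulation, not the paper's modified hierarchical model).
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class LinearExpert:
    """A single expert: a linear map from inputs to class scores."""
    def __init__(self, n_in, n_out):
        self.W = rng.normal(scale=0.1, size=(n_in, n_out))
        self.b = np.zeros(n_out)

    def __call__(self, x):
        return x @ self.W + self.b

class MixtureOfExperts:
    """Flat mixture of experts with a softmax gating network."""
    def __init__(self, n_in, n_out, n_experts):
        self.experts = [LinearExpert(n_in, n_out) for _ in range(n_experts)]
        self.gate = LinearExpert(n_in, n_experts)

    def __call__(self, x):
        # Gating weights decide how much each expert contributes per input.
        g = softmax(self.gate(x))                      # (batch, n_experts)
        outs = np.stack([e(x) for e in self.experts])  # (n_experts, batch, n_out)
        return np.einsum('be,ebo->bo', g, outs)

    def grow(self, n_new=1):
        # Hypothetical constructive step: add experts and widen the gate,
        # loosely mirroring the idea of growing the architecture iteratively.
        # (The new gate is re-initialized here; a real scheme would preserve
        # and extend the learned gating parameters.)
        n_in = self.gate.W.shape[0]
        n_out = self.experts[0].W.shape[1]
        for _ in range(n_new):
            self.experts.append(LinearExpert(n_in, n_out))
        self.gate = LinearExpert(n_in, len(self.experts))

# Usage: score a small random batch, then grow the model by one expert.
moe = MixtureOfExperts(n_in=4, n_out=3, n_experts=2)
x = rng.normal(size=(8, 4))
print(moe(x).shape)      # (8, 3)
moe.grow()
print(len(moe.experts))  # 3
```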
