Abstract

Neural network design aims for high classification accuracy and low network architecture complexity. It is also known that simultaneously optimizing model accuracy and complexity improves generalization while avoiding overfitting to the data. We describe a neural network training procedure that uses multi-objective optimization to evolve networks that are optimal with respect to both classification accuracy and architectural complexity. The NSGA-II algorithm is employed to evolve a population of neural networks that are minimal in both training error and a Minimum Description Length-based network complexity measure. We further propose a pruning rule based on the following heuristic: connections to or from a node may be severed if their weight values are smaller than the network's smallest bias. Experiments on benchmark datasets show that the proposed evolutionary multi-objective approach, combined with the bias-based pruning heuristic, yields networks with far fewer connections without seriously compromising generalization performance when compared to other existing evolutionary optimization algorithms.
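As a rough illustration of the pruning rule stated above, the following Python sketch zeroes out any connection whose weight magnitude falls below the smallest bias magnitude in the network. The array-based network representation, the use of absolute values in the comparison, and all function names are assumptions made for illustration only, not the paper's implementation.

```python
import numpy as np

def prune_by_smallest_bias(weights, biases):
    """Bias-based pruning sketch: sever (zero out) any connection whose
    weight magnitude is below the smallest bias magnitude in the network.

    weights : list of 2-D NumPy arrays, one per layer (inputs x outputs)
    biases  : list of 1-D NumPy arrays, one per layer
    Returns pruned copies of the weight matrices and the number of
    connections removed.
    """
    # Threshold: the smallest bias magnitude across the whole network
    # (assumption: the comparison is made on absolute values).
    threshold = min(np.abs(b).min() for b in biases)

    pruned, removed = [], 0
    for w in weights:
        mask = np.abs(w) >= threshold   # keep connections at or above the threshold
        removed += int((~mask).sum())
        pruned.append(w * mask)         # severed connections become exact zeros
    return pruned, removed

# Illustrative usage on a tiny 2-3-1 network with random parameters.
rng = np.random.default_rng(0)
weights = [rng.normal(size=(2, 3)), rng.normal(size=(3, 1))]
biases = [rng.normal(size=3), rng.normal(size=1)]
pruned_weights, n_removed = prune_by_smallest_bias(weights, biases)
print(f"Removed {n_removed} of {sum(w.size for w in weights)} connections")
```

In an evolutionary setting such as the one described, a rule like this could be applied to each candidate network before its complexity objective is evaluated, so that pruned connections no longer count toward the description length.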
