Abstract

In real-world applications, the performance of an Artificial Neural Network (ANN) depends largely on its generalization capability, that is, its ability to handle unseen data. Generalization is determined chiefly by the complexity of the network and by how the network is trained. Poor generalization is observed when the network is over-trained or when its complexity (degrees of freedom) is large relative to the amount of training data. A smaller network that can still fit the data will therefore generalize better. Pruning of network parameters is one promising way to reduce the degrees of freedom of a network and hence improve its generalization. In recent years, various pruning methods have been developed and shown to be effective in real-world applications. It is then important to estimate the improvement in generalization, and the rate of that improvement, as pruning is incorporated into the network. This research develops a method to evaluate generalization capability and the rate of convergence towards it. Using the proposed method, experiments were conducted on a Multi-Layer Perceptron neural network with pruning incorporated, applied to handwritten numeral recognition.
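The abstract does not specify which pruning algorithm the paper uses, so the following is only a minimal sketch of the general idea, assuming simple magnitude-based weight pruning: weights whose absolute value falls below a data-dependent threshold are zeroed, removing degrees of freedom from the layer. The function name magnitude_prune and the 784x64 layer shape (sized for 28x28 digit images) are illustrative assumptions, not details from the paper.

```python
import numpy as np

def magnitude_prune(weights, fraction):
    """Zero out the smallest-magnitude entries of one layer's weight matrix.

    weights:  2-D weight matrix of an MLP layer
    fraction: share of weights to remove, in [0, 1]
    NOTE: magnitude pruning is an illustrative choice; the paper's actual
    pruning method is not named in the abstract.
    """
    flat = np.abs(weights).ravel()
    k = int(fraction * flat.size)
    if k == 0:
        return weights.copy()
    # k-th smallest magnitude becomes the pruning threshold.
    threshold = np.partition(flat, k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

# Example: prune 50% of a random input-to-hidden weight matrix sized
# for 28x28 handwritten-digit images (784 inputs, 64 hidden units).
rng = np.random.default_rng(0)
W = rng.normal(size=(784, 64))
W_pruned = magnitude_prune(W, 0.5)
print("nonzero before:", np.count_nonzero(W), "after:", np.count_nonzero(W_pruned))
```

In an evaluation of the kind the abstract describes, one would measure held-out (unseen-data) accuracy at increasing pruning fractions to estimate both the improvement in generalization and how quickly that improvement converges.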
