Abstract

Determining an effective architecture for a multi-layer feedforward backpropagation neural network can be a time-consuming effort. In general it requires human intervention to choose the number of layers, the number of hidden cells, the learning rule, and the learning parameters. Over the past few years several approaches to dynamically configuring neural networks have been proposed, which remove most of the responsibility for choosing the correct network configuration from the user. Just as important as finding a viable network architecture for a given learning problem is obtaining a minimal configuration. The total time required to emulate or simulate a neural network depends largely on the number of connections in the network, so pruning methods that reduce network complexity are essential. In this paper two approaches to network pruning are investigated: single- and multi-pass pruning. Their effectiveness is demonstrated by applying them to several real-world proble...
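The abstract does not detail the single- and multi-pass procedures themselves, but the core idea of pruning — removing connections to shrink a trained network — can be illustrated with a generic magnitude-based sketch (a common baseline, not necessarily the method used in the paper). Here the weakest fraction of weights is zeroed out; a multi-pass scheme would repeat such a step with retraining in between, while a single-pass scheme prunes once:

```python
import numpy as np

def prune_by_magnitude(weights, fraction):
    """Zero out the smallest-magnitude `fraction` of connections.

    Generic illustration of connection pruning; the paper's own
    single-/multi-pass criteria may differ.
    """
    w = weights.copy()
    k = int(fraction * w.size)  # number of connections to remove
    if k == 0:
        return w
    # k-th smallest absolute weight becomes the pruning threshold
    threshold = np.partition(np.abs(w), k - 1, axis=None)[k - 1]
    w[np.abs(w) <= threshold] = 0.0
    return w

# Example: prune half the connections of a small 4x4 weight matrix
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
pruned = prune_by_magnitude(w, 0.5)
print(np.count_nonzero(pruned))  # 8 of the 16 connections remain
```

Fewer surviving connections translate directly into fewer multiply-accumulate operations during simulation, which is the motivation the abstract gives for pruning.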
