Abstract

This paper describes the cascade neural network design algorithm (CNNDA), a new algorithm for designing compact, two-hidden-layer artificial neural networks (ANNs). The algorithm automatically determines an ANN's architecture together with its connection weights. The design strategy used in the CNNDA is intended to optimize both the generalization ability and the training time of ANNs. To improve generalization ability, the CNNDA combines constructive and pruning algorithms with bounded fan-ins of the hidden nodes. To reduce computational cost and training time, the CNNDA uses a new training approach in which the input weights of a hidden node are temporarily frozen when its output changes little over a few successive training cycles. The CNNDA was tested on several ANN benchmarks, including the cancer, diabetes and character-recognition problems. The experimental results show that, in comparison with other algorithms, the CNNDA produces compact ANNs with good generalization ability and short training time.
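
The abstract does not give implementation details of the weight-freezing heuristic; the following is only a minimal sketch of the general idea, assuming hypothetical stability (`tol`) and patience parameters and a simple tanh hidden node, not the authors' actual procedure.

```python
import numpy as np

class FreezableHiddenNode:
    """Illustrative hidden node that freezes its input weights once its
    output has changed little over several successive training cycles."""

    def __init__(self, n_inputs, tol=1e-3, patience=3):
        self.weights = np.random.uniform(-0.5, 0.5, n_inputs)
        self.tol = tol            # assumed output-change threshold
        self.patience = patience  # assumed number of stable cycles required
        self.frozen = False
        self._prev_output = None
        self._stable_cycles = 0

    def forward(self, x):
        return np.tanh(self.weights @ x)

    def update(self, x, grad, lr=0.05):
        out = self.forward(x)
        # Count successive cycles in which the node's output barely changes.
        if self._prev_output is not None and abs(out - self._prev_output) < self.tol:
            self._stable_cycles += 1
        else:
            self._stable_cycles = 0
        self._prev_output = out

        # Freeze the input weights once the output has stabilized,
        # so they are skipped in subsequent weight updates.
        if self._stable_cycles >= self.patience:
            self.frozen = True
        if not self.frozen:
            self.weights -= lr * grad
```

In the paper the freezing is described as temporary (a computational-cost saving during training); the sketch omits any later unfreezing step for brevity.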
