Abstract

This paper proposes a new type of neural network, the Dynamic Threshold Neural Network (DTNN), which is theoretically and experimentally superior to a conventional sigmoidal multilayer neural network in classification capability. Given a training set containing 4k + 1 patterns in R^n, the upper bound on the number of free parameters a DTNN needs to successfully learn this training set is (k + 1)(n + 2) + 2(k + 1), while the upper bound for a sigmoidal network is 2k(n + 1) + (2k + 1). We also derive a learning algorithm for the DTNN, in a manner analogous to the derivation of the backpropagation algorithm. In simulations on the Two-Spirals problem, our DTNN with 30 neurons in one hidden layer takes only 3200 epochs on average to learn the whole training set successfully, whereas single-hidden-layer feedforward sigmoidal neural networks have never been reported to learn this training set successfully, even when more hidden neurons are used.
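The two parameter bounds quoted above can be compared directly. A minimal sketch follows; the function names are ours, not the paper's, and the example values of k and n are chosen purely for illustration:

```python
# Hypothetical helpers computing the free-parameter upper bounds stated in the
# abstract, for a training set of 4k + 1 patterns in R^n.

def dtnn_bound(k: int, n: int) -> int:
    """Upper bound on free parameters for a DTNN: (k + 1)(n + 2) + 2(k + 1)."""
    return (k + 1) * (n + 2) + 2 * (k + 1)

def sigmoid_bound(k: int, n: int) -> int:
    """Upper bound for a conventional sigmoidal network: 2k(n + 1) + (2k + 1)."""
    return 2 * k * (n + 1) + (2 * k + 1)

# Example: k = 10, patterns in R^2.
print(dtnn_bound(10, 2))     # 66
print(sigmoid_bound(10, 2))  # 81
```

For growing k, the DTNN bound scales roughly as k(n + 4) while the sigmoidal bound scales as 2k(n + 2), so the DTNN bound is smaller for any n > 0, consistent with the claimed advantage.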