Abstract

Fully connected cascade (FCC) networks are a recently proposed class of neural networks in which each layer has only one neuron and each neuron is connected to all neurons in the preceding layers. In this paper we derive and describe in detail an efficient backpropagation algorithm (named BPFCC) for computing the gradient for FCC networks. At its core, the backpropagation in BPFCC is a carefully designed procedure for computing the derivative amplification coefficients, which are essential for gradient computation; the average time complexity for computing an entry of the gradient is O(1). BPFCC must be invoked by a training algorithm to do useful work, and we wrote a program, FCCNET, for that purpose. Currently, FCCNET uses the Levenberg-Marquardt algorithm to train FCC networks, and the loss function for classification is designed based on a nonlinear extension of logistic regression. For two-class classification, we derive a Gauss-Newton-like approximation for the Hessian of the loss function; when there are more than two classes, a numerical approximation of the Hessian is used. Experimental results confirm the efficiency of BPFCC and the validity of the companion techniques.
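The paper itself gives the precise formulation; as a non-authoritative sketch of the FCC topology described above, the following Python snippet implements a forward pass in which neuron i receives the network inputs plus the outputs of all earlier neurons. The function name fcc_forward, the tanh activation, and the weight layout are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def fcc_forward(x, weights, activation=np.tanh):
    """Forward pass through a fully connected cascade (FCC) network.

    x       : 1-D input vector of length n_in.
    weights : list of 1-D arrays; weights[i] has length
              n_in + i + 1 (inputs + outputs of neurons 0..i-1 + bias).
    Returns the output of the last neuron and all neuron outputs.
    """
    signals = list(x)                 # inputs feed every neuron
    outputs = []
    for w in weights:
        net = np.dot(w[:-1], signals) + w[-1]   # bias is the last weight
        y = activation(net)
        outputs.append(y)
        signals.append(y)             # cascade: later neurons also see y
    return outputs[-1], outputs

# Example: 2 inputs, 3 cascaded neurons (hypothetical sizes)
rng = np.random.default_rng(0)
n_in, n_neurons = 2, 3
weights = [rng.normal(size=n_in + i + 1) for i in range(n_neurons)]
y, all_outputs = fcc_forward(np.array([0.5, -1.0]), weights)
```

Because each neuron sees all earlier outputs, neuron i has n_in + i + 1 weights, which is why a per-entry gradient cost of O(1), as the paper claims for BPFCC, is nontrivial.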

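The abstract also mentions training with the Levenberg-Marquardt algorithm and a Gauss-Newton-like Hessian approximation. As a generic sketch of that textbook update, not the paper's exact BPFCC/FCCNET implementation, one step solves (JᵀJ + μI) Δw = -Jᵀr; all names below are hypothetical.

```python
import numpy as np

def levenberg_marquardt_step(J, r, mu):
    """One generic Levenberg-Marquardt update.

    J  : Jacobian of the residuals w.r.t. the weights, shape (m, n).
    r  : residual vector of length m.
    mu : damping factor; large mu behaves like gradient descent,
         small mu like Gauss-Newton.
    Solves (J^T J + mu * I) dw = -J^T r, where J^T J is the
    Gauss-Newton-like approximation of the Hessian.
    """
    n = J.shape[1]
    A = J.T @ J + mu * np.eye(n)      # damped approximate Hessian
    return np.linalg.solve(A, -J.T @ r)
```

In this scheme the gradient entries that BPFCC computes populate the Jacobian J, so the O(1) per-entry cost directly bounds the cost of assembling each training step.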