Abstract
In this paper, a Binary Neural Network Classifier (BNNC) is proposed in which hidden-layer training is performed in parallel. A learning algorithm for the BNNC is described, based on the principle of the Fast Covering Learning Algorithm (FCLA) proposed by Wang and Chaudhari [1]. The BNNC offers a high degree of parallelism in hidden-layer formation: each module in the hidden layer is exposed to the patterns of only one class. To achieve better accuracy, the issue of overlapping classes is also handled. The method is tested on a few benchmark datasets, and the accuracies are within the acceptable range. Because of the parallelism at the hidden-layer level, training time is reduced, so the method can be applied to voluminous realistic databases. An analytical formulation is developed to evaluate the number of hidden-layer neurons, which is O(log(N)), where N represents the number of inputs.
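The per-class parallelism described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the dispatching via a thread pool, and the placeholder "training" step (which merely records the class label and pattern count) are all assumptions made for demonstration; the actual FCLA-based module training is in the paper body.

```python
# Conceptual sketch of the BNNC's parallel hidden-layer formation:
# one hidden-layer module per class, with the per-class trainings
# dispatched concurrently. The training logic here is a placeholder.
from concurrent.futures import ThreadPoolExecutor


def train_module(class_label, patterns):
    """Train one hidden-layer module on the patterns of a single class.
    Placeholder: real training would apply the FCLA covering procedure."""
    return {"class": class_label, "n_patterns": len(patterns)}


def train_hidden_layer(dataset):
    """dataset: dict mapping class label -> list of binary patterns.
    Each class's module is trained concurrently, mirroring the BNNC's
    per-class parallelism in hidden-layer formation."""
    with ThreadPoolExecutor() as pool:
        futures = {c: pool.submit(train_module, c, pats)
                   for c, pats in dataset.items()}
        return {c: f.result() for c, f in futures.items()}


# Example with made-up 4-bit binary patterns for three classes.
data = {
    0: [[0, 0, 0, 1], [0, 0, 1, 1]],
    1: [[1, 0, 0, 0], [1, 1, 0, 0], [1, 0, 1, 0]],
    2: [[0, 1, 1, 0]],
}
modules = train_hidden_layer(data)
```

Because each module sees only its own class's patterns, the per-class trainings share no state and can run fully independently, which is what makes the hidden-layer formation parallelizable.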