Abstract

Recently, significant improvements have been observed in the recognition rates of deep neural networks (DNNs). However, as the number of layers increases, a DNN requires additional computation and consumes significantly more power. In this study, we propose a novel spiking neural network (SNN) that achieves a high recognition rate at a reduced computational cost. If the reliability of the current neural network's (NN's) output is judged to be low, the result is fed forward to the input of the next NN. Each component NN is trained with the backpropagation learning algorithm. Because most decisions are made in the early stages, the proposed method reduces computational cost by approximately 83% compared with a conventional SNN at the same recognition rate.
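The abstract describes a cascade in which each stage only hands off to the next, costlier stage when its own output is not reliable enough. A minimal sketch of that confidence-gated cascading is given below. It is an illustration only: the function names, the softmax-based reliability measure, and the 0.9 threshold are assumptions, and for simplicity each later stage here re-evaluates the original input rather than the previous stage's result as the paper does.

    import numpy as np

    def softmax(logits):
        # Numerically stable softmax over the class axis.
        z = logits - np.max(logits)
        e = np.exp(z)
        return e / e.sum()

    def cascade_predict(x, stages, threshold=0.9):
        """Run the input through a cascade of classifiers, stopping early
        once a stage's output is judged reliable enough.

        stages    -- callables mapping an input vector to class logits
        threshold -- minimum top-class probability needed to accept a
                     decision (a hypothetical reliability criterion; the
                     abstract does not specify the exact measure used)
        """
        probs = None
        for stage in stages:
            probs = softmax(stage(x))
            if probs.max() >= threshold:
                break  # reliable enough: skip the remaining, costlier stages
        return int(np.argmax(probs))

    # Toy usage: two random linear "networks" stand in for trained stages.
    rng = np.random.default_rng(0)
    W_small = 0.01 * rng.standard_normal((10, 784))  # cheap first stage
    W_large = 0.01 * rng.standard_normal((10, 784))  # costlier later stage
    stages = [lambda x: W_small @ x, lambda x: W_large @ x]
    label = cascade_predict(rng.standard_normal(784), stages)

Under this scheme, the reported ~83% cost reduction corresponds to most inputs terminating at the early, cheap stages, so the expensive later stages are invoked only for the harder cases.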
