Abstract
Spiking Neural Networks (SNNs) are being explored to emulate the astounding capabilities of the human brain, which can learn to perform robust and efficient computations with noisy spikes. A variety of spiking neuron models have been proposed to resemble biological neuronal functionality. The simplest and most commonly used among these are the leaky-integrate-and-fire (LIF) model, which contains a leak path in its membrane potential, and the integrate-and-fire (IF) model, in which the leak path is absent. While LIF models have been argued to be more bio-plausible, a comparative analysis of models with and without leak from a purely computational point of view demands attention, which we try to address in this paper. Our results reveal that the LIF model provides improved robustness and better generalization compared to IF. Frequency-domain analysis demonstrates that leak aids in eliminating high-frequency components from the input, thus enhancing the noise-robustness of SNNs. Additionally, we compare the sparsity of computation between these models. In general, for the same input, the LIF model would be expected to achieve higher sparsity than IF due to the layer-wise decay of spikes caused by the membrane-potential leak over time. However, contrary to this expectation, we observe that leak decreases the sparsity of computation. There thus exists a trade-off between robustness and energy-efficiency in SNNs, which can be optimized through a suitable choice of the amount of leak in the models.
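To make the LIF/IF distinction concrete, here is a minimal discrete-time sketch (not the paper's code): a single neuron that either decays its membrane potential by an illustrative leak factor `beta` each step (LIF) or integrates without decay (IF). The threshold `v_th`, `beta`, and the hard reset are all assumptions chosen for illustration.

```python
def simulate(inputs, beta=0.5, v_th=1.0, leak=True):
    """Simulate one spiking neuron over a sequence of input currents.

    LIF: membrane potential decays by factor `beta` each step (leak=True).
    IF:  potential integrates without decay (leak=False).
    Returns the list of binary spike outputs (illustrative model, hard reset).
    """
    v = 0.0
    spikes = []
    for x in inputs:
        v = (beta * v if leak else v) + x
        if v >= v_th:          # fire and reset once the threshold is crossed
            spikes.append(1)
            v = 0.0            # hard reset after a spike
        else:
            spikes.append(0)
    return spikes

# With a constant sub-threshold drive of 0.3, the leak bounds the LIF
# potential near 0.3 / (1 - beta) = 0.6 < v_th, so it never fires, while
# the IF neuron accumulates charge and fires every fourth step.
inputs = [0.3] * 10
print(sum(simulate(inputs, leak=True)))   # 0 spikes (LIF)
print(sum(simulate(inputs, leak=False)))  # 2 spikes (IF)
```

This toy example shows the low-pass, charge-dissipating role of leak discussed in the abstract; it does not reproduce the paper's network-level sparsity or robustness experiments.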