Abstract

Brain-inspired computing presents a promising approach to accelerating the development of artificial general intelligence (AGI). Among its most critical components, spiking neural networks (SNNs) offer advantages for AGI such as low power consumption. Training SNNs that simultaneously achieve high generalization ability, high robustness, and low power consumption remains a significant challenge for the development and successful application of spike-based machine intelligence. In this research, we present a novel and flexible learning framework, termed high-order spike-based information bottleneck (HOSIB), that leverages the surrogate gradient technique. The HOSIB framework, in its second-order and third-order formulations, i.e., the second-order information bottleneck (SOIB) and the third-order information bottleneck (TOIB), comprehensively explores the common latent architecture and the intrinsic spike-based information while discarding superfluous information in the data, thereby improving the generalization capability and robustness of SNN models. Specifically, HOSIB relies on the information bottleneck (IB) principle to promote sparse spike-based information representations and to flexibly balance their exploitation and loss. Extensive classification experiments empirically demonstrate the promising generalization ability of HOSIB. Furthermore, we apply the SOIB and TOIB algorithms to deep spiking convolutional networks to demonstrate their improved robustness under various categories of noise. The experimental results show that the HOSIB framework, especially TOIB, achieves better generalization ability, robustness, and power efficiency than current representative approaches.
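
The abstract does not spell out the HOSIB objective, so the following is only an illustrative sketch of the two ingredients it names: a surrogate-gradient spike function and an IB-style loss that trades task fit against compression of the spike representation. The names `SurrogateSpike` and `ib_style_loss`, the rectangular surrogate, and the firing-rate penalty are assumptions for illustration, not the paper's definitions (in particular, HOSIB's second- and third-order penalties are defined in the full text).

```python
import torch


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass; smooth surrogate in the backward pass."""

    @staticmethod
    def forward(ctx, v, threshold):
        ctx.save_for_backward(v)
        ctx.threshold = threshold
        # Non-differentiable hard threshold produces the binary spike train.
        return (v >= threshold).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Rectangular surrogate: let gradient flow only near the firing
        # threshold, a common choice for training SNNs end to end.
        grad_v = grad_output * (torch.abs(v - ctx.threshold) < 0.5).float()
        return grad_v, None


def ib_style_loss(task_loss, spikes, beta=1e-3):
    # Generic IB-style trade-off: fit the task while compressing the
    # spike-based representation. The mean firing rate is a simple
    # stand-in for the (higher-order) information penalty in HOSIB.
    return task_loss + beta * spikes.mean()


# Usage sketch:
#   spikes = SurrogateSpike.apply(membrane_potential, 1.0)
#   loss = ib_style_loss(cross_entropy, spikes)
```

The `beta` weight plays the role of the IB trade-off parameter: larger values push toward sparser spiking (lower power, more compression), smaller values favor task performance.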
