Abstract

In recent years, deep spiking neural networks (SNNs) have demonstrated promising performance across various applications, owing to their low-power characteristics. Research on SNN meta-learning has enabled SNNs to reduce both label cost and computational power consumption in few-shot classification tasks. However, current SNN meta-learning methods still lag behind traditional artificial neural networks (ANNs) in terms of accuracy. In this work, we explore a two-stage metric-based SNN meta-learning framework that achieves the highest accuracy reported among SNN meta-learning methods. The framework comprises a pre-training stage and a meta-training stage. During pre-training, a classification embedding SNN model (CESM) is trained to extract image features. Subsequently, in the meta-training stage, the meta embedding SNN model (MESM) employs the centered kernel alignment (CKA) method to measure the similarity between learned features for meta-learning. We conduct extensive experiments on the Omniglot, tieredImageNet, and miniImageNet datasets, evaluating both the CESM and MESM models. Experimental results demonstrate that the proposed framework improves accuracy by 5% on average over previous SNN meta-learning approaches. The proposed method surpasses early classical ANN methods and further closes the gap with state-of-the-art ANN methods.
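As background, the centered kernel alignment (CKA) measure referenced in the abstract can be sketched as follows. This is a minimal NumPy implementation of the standard *linear* CKA between two feature matrices (rows = examples, columns = feature dimensions); the paper's exact CKA variant (linear vs. kernel) and how it is plugged into the metric-based meta-learner are assumptions here, not details taken from the abstract.

```python
import numpy as np

def linear_cka(X, Y):
    """Linear CKA similarity between feature matrices X (n x p1) and Y (n x p2).

    Both matrices must have the same number of rows (examples).
    Returns a value in [0, 1]; 1 means the representations are identical
    up to orthogonal transformation and isotropic scaling.
    """
    # Center each feature dimension across examples.
    X = X - X.mean(axis=0, keepdims=True)
    Y = Y - Y.mean(axis=0, keepdims=True)
    # Linear CKA: ||Y^T X||_F^2 / (||X^T X||_F * ||Y^T Y||_F)
    numerator = np.linalg.norm(Y.T @ X, ord="fro") ** 2
    denominator = (np.linalg.norm(X.T @ X, ord="fro")
                   * np.linalg.norm(Y.T @ Y, ord="fro"))
    return numerator / denominator
```

In a metric-based few-shot setting, a score like this could be computed between query embeddings and per-class support embeddings, with the most similar class taken as the prediction; that usage is an illustrative assumption.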
