Abstract Neural computation frameworks are essential for advancing computational neuroscience and artificial intelligence, offering a robust platform for simulating intricate brain-like processes and fostering the growth of intelligent systems. This study presents TinySpiking, a novel, lightweight, and energy-efficient Python framework for the simulation and learning of Spiking Neural Networks (SNNs). Unlike traditional frameworks, TinySpiking does not depend on third-party libraries, thereby reducing computational overhead and energy consumption. The framework's core innovation is its implementation of unsupervised learning through Spike-Timing-Dependent Plasticity (STDP), a biologically inspired learning rule that adjusts synaptic weights based on the precise timing of neuronal spikes. This mechanism is crucial for constructing complex neural architectures and effectively processing spatio-temporal data, meeting the dynamic and intricate demands of real-world data analysis. To demonstrate the practical utility of TinySpiking, we applied it to image reconstruction tasks on the MNIST and landmark datasets. The results not only validate the network's ability to autonomously identify and enhance key visual features without external supervision but also highlight its efficiency in learning from data, mirroring the adaptability of biological neural systems. In conclusion, TinySpiking's lightweight design and its demonstrated effectiveness in unsupervised learning tasks make it a standout computational tool for computational neuroscience and machine learning. Its low time consumption, biological plausibility, and independence from third-party libraries position it as a compelling platform for future research and applications, promising to drive advancements in unsupervised learning and intelligent system development.
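The STDP rule described above can be sketched in dependency-free Python, in keeping with the framework's design. This is a minimal illustration of the standard exponential STDP update, not TinySpiking's actual API; the function name and parameters (`a_plus`, `a_minus`, `tau`) are assumptions for illustration only.

```python
import math

def stdp_delta_w(t_pre, t_post, a_plus=0.05, a_minus=0.05, tau=20.0):
    """Weight change for one pre/post spike pair (times in ms).

    Illustrative only: parameter names and values are assumed, not
    taken from TinySpiking itself.
    """
    dt = t_post - t_pre
    if dt > 0:
        # Pre-synaptic spike precedes post-synaptic spike:
        # potentiation (synapse strengthened, decaying with |dt|).
        return a_plus * math.exp(-dt / tau)
    if dt < 0:
        # Post-synaptic spike precedes pre-synaptic spike:
        # depression (synapse weakened, decaying with |dt|).
        return -a_minus * math.exp(dt / tau)
    return 0.0

# A pre-spike 5 ms before a post-spike strengthens the synapse;
# the reverse ordering weakens it.
w = 0.5
w += stdp_delta_w(t_pre=10.0, t_post=15.0)  # potentiation
w += stdp_delta_w(t_pre=30.0, t_post=25.0)  # depression
```

Because each update depends only on spike timing, no error signal or labels are required, which is what makes the learning unsupervised.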