Abstract

This correspondence paper provides, to the best of our knowledge, a first analysis of how biologically plausible spiking neural networks (SNNs) equipped with Spike-Timing-Dependent Plasticity (STDP) can learn to detect people on the fly from non-independent and identically distributed (non-i.i.d.) streams of retina-inspired, event-camera data. Our system works as follows. First, a short sequence of event data capturing a walking human from a flying drone is forwarded in its natural order to an SNN-STDP system, which also receives teacher spiking signals from the neural-activity readout block. Then, once the end of the learning sequence is reached, the learned system is evaluated on testing sequences. In addition, we present a new interpretation of anti-Hebbian plasticity as an over-fitting control mechanism and provide experimental demonstrations of our findings. This work contributes to the study of attention-based development and perception in bio-inspired systems.
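To make the learning setup described above concrete, the sketch below shows a generic pair-based STDP weight update driven by an event-like input stream and a teacher spiking signal, with an optional sign flip standing in for the anti-Hebbian variant. All layer sizes, time constants, and amplitudes are hypothetical illustrations, not the authors' implementation or parameters.

```python
# Minimal, illustrative sketch of pair-based STDP with a teacher spike signal.
# All names and constants (n_in, n_out, tau, a_plus, a_minus) are assumptions
# chosen for illustration; this is not the paper's actual model or code.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 64, 8                        # hypothetical layer sizes
W = rng.uniform(0.0, 0.5, (n_out, n_in))   # input -> readout weights

tau = 20.0                      # trace time constant (ms), assumed
a_plus, a_minus = 0.01, 0.012   # potentiation / depression amplitudes, assumed
dt = 1.0                        # simulation step (ms)

x_pre = np.zeros(n_in)          # presynaptic eligibility traces
x_post = np.zeros(n_out)        # postsynaptic eligibility traces

def stdp_step(pre_spikes, post_spikes, W, anti_hebbian=False):
    """One STDP update; pre_spikes/post_spikes are 0/1 vectors for this step.
    With anti_hebbian=True the update sign is flipped, loosely mirroring the
    anti-Hebbian term the abstract interprets as over-fitting control."""
    global x_pre, x_post
    # Decay the traces and add the new spikes.
    x_pre += (-x_pre / tau) * dt + pre_spikes
    x_post += (-x_post / tau) * dt + post_spikes
    # Potentiation: a postsynaptic spike paired with recent presynaptic activity.
    dW = a_plus * np.outer(post_spikes, x_pre)
    # Depression: a presynaptic spike paired with recent postsynaptic activity.
    dW -= a_minus * np.outer(x_post, pre_spikes)
    if anti_hebbian:
        dW = -dW
    W += dW
    np.clip(W, 0.0, 1.0, out=W)             # keep weights in a bounded range
    return W

# Toy non-i.i.d.-style stream: sparse input events plus teacher spikes that
# drive the readout neurons during the learning sequence.
for t in range(200):
    pre = (rng.random(n_in) < 0.05).astype(float)       # event-camera-like input
    teacher = (rng.random(n_out) < 0.02).astype(float)  # teacher spiking signal
    W = stdp_step(pre, teacher, W)
```

After the learning pass over the sequence, the weights would be frozen and the readout assessed on held-out testing sequences, as described in the abstract.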
