Abstract

This live demonstration will show real-time, embedded learning of gestures presented to a dynamic vision sensor, with all processing performed in neuromorphic hardware. A multi-layer spiking neural network, implemented on the Loihi neuromorphic processor and partially trained on 11 classes of gestures, will learn new gesture classes shown to the vision sensor through a combination of transfer learning and local synaptic plasticity. Visitors will experience real-time learning of new gesture classes they show to the vision sensor, whose event stream is processed live by the network on a connected neuromorphic chip.
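
As a rough illustration of the transfer-learning-plus-local-plasticity idea: pretrained layers are kept frozen and only the output layer adapts through a learning rule that uses locally available signals. The abstract does not specify the learning rule, layer sizes, or the Loihi APIs used, so everything below (including the delta-rule update and all names) is a hypothetical NumPy sketch, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_HID, N_OUT = 64, 128, 12              # hypothetical sizes; 12 = 11 pretrained + 1 new class
W_hid = rng.normal(0.0, 0.3, (N_HID, N_IN))   # frozen weights from pretraining (stand-in)
W_out = rng.normal(0.0, 0.1, (N_OUT, N_HID))  # plastic output layer, adapted during the demo
LR = 0.05                                     # learning rate for the plastic layer

def hidden_spikes(events):
    """Frozen feature extractor: hidden-layer spike pattern for one gesture window.

    A simple threshold stands in for the leaky integrate-and-fire dynamics
    that the spiking network would compute on-chip.
    """
    drive = W_hid @ events
    return (drive > 0.5).astype(float)

def local_update(h, target):
    """Local plasticity on the output layer only (a spike-based delta rule).

    Each synapse uses only its presynaptic activity h[j] and the postsynaptic
    error (target - output), so the update needs no information from other
    layers; the frozen hidden weights provide the transferred features.
    """
    global W_out
    y = (W_out @ h > 1.0).astype(float)        # output spikes
    W_out += LR * np.outer(target - y, h)      # local error-driven update

# Few-shot learning of a new gesture class (index 11):
events = rng.random(N_IN)                      # stand-in for DVS event counts
target = np.zeros(N_OUT)
target[11] = 1.0
h = hidden_spikes(events)
for _ in range(10):                            # a few presentations of the new gesture
    local_update(h, target)
print("predicted class:", int(np.argmax(W_out @ h)))
```

Because only the output layer changes, adapting to a new class requires just a handful of presentations, which is what makes the real-time, on-chip learning in the demonstration feasible.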
