Abstract
This paper describes a fully spike-based neural network for optical flow estimation from dynamic vision sensor data. A low-power embedded implementation of the method, which combines the Asynchronous Time-based Image Sensor (ATIS) with IBM's TrueNorth Neurosynaptic System, is presented. The sensor generates spikes with sub-millisecond resolution in response to scene illumination changes. These spikes are processed by a spiking neural network running on TrueNorth at a 1-ms resolution to accurately determine the order and time difference of spikes from neighbouring pixels, and therefore infer the velocity. The spiking neural network is a variant of the Barlow-Levick method for optical flow estimation. The system is evaluated on two recordings for which ground-truth motion is available, and achieves an average endpoint error of 11% at an estimated power budget of under 80 mW for the sensor and computation.
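The core mechanism the abstract describes, inferring motion direction from the order of spikes at neighbouring pixels and speed from their time difference, can be illustrated with a minimal sketch. The snippet below is not the authors' TrueNorth network; it is a plain-Python illustration of the Barlow-Levick time-difference principle, quantised to the 1-ms tick resolution stated in the abstract. The pixel pitch, tick constant, and function name are hypothetical.

```python
# Hypothetical parameters; not taken from the paper.
PIXEL_PITCH = 1.0   # distance between neighbouring pixels (pixel units)
TICK_MS = 1.0       # processing tick resolution (1 ms, per the abstract)

def barlow_levick_velocity(t_spike_a, t_spike_b):
    """Estimate 1-D velocity from spike times (ms) at two neighbouring pixels.

    Barlow-Levick-style detectors infer direction from the order of the two
    spikes and speed from their time difference: an edge moving from pixel A
    to pixel B fires A first, and the shorter the delay, the faster the motion.
    """
    # Quantise spike times to the 1-ms processing resolution.
    ta = round(t_spike_a / TICK_MS)
    tb = round(t_spike_b / TICK_MS)
    dt = tb - ta
    if dt == 0:
        return None  # simultaneous ticks: speed exceeds the temporal resolution
    # Positive velocity means motion from pixel A towards pixel B.
    return PIXEL_PITCH / (dt * TICK_MS)

# Example: pixel A spikes at 2 ms, pixel B at 6 ms -> motion A->B at 0.25 px/ms.
print(barlow_levick_velocity(2.0, 6.0))   # 0.25
print(barlow_levick_velocity(6.0, 2.0))   # -0.25 (opposite direction)
```

In the paper's system this comparison is realised with spiking neurons on TrueNorth rather than arithmetic, but the quantisation step above reflects why the 1-ms tick bounds the fastest measurable motion between neighbouring pixels.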