Abstract

Stereo vision is an important feature that enables machine vision systems to perceive their environment in 3D. While machine vision has spawned a variety of software algorithms to solve the stereo-correspondence problem, their implementation and integration in small, fast, and efficient hardware vision systems remains a difficult challenge. Recent advances made in neuromorphic engineering offer a possible solution to this problem, with the use of a new class of event-based vision sensors and neural processing devices inspired by the organizing principles of the brain. Here we propose a radically novel model that solves the stereo-correspondence problem with a spiking neural network that can be directly implemented with massively parallel, compact, low-latency and low-power neuromorphic engineering devices. We validate the model with experimental results, highlighting features that are in agreement with both computational neuroscience stereo vision theories and experimental findings. We demonstrate its features with a prototype neuromorphic hardware system and provide testable predictions on the role of spike-based representations and temporal dynamics in biological stereo vision processing systems.

Highlights

  • The spiking neural network we propose is inspired by the well-established cooperative network of Marr and Poggio[19], but differs from it in two major ways: first, the input to the network does not consist of static images but of dynamic spatio-temporal visual information, in the form of spike trains obtained directly from event-based neuromorphic vision sensors; second, the network is composed of Integrate-and-Fire spiking neurons operating in a massively parallel fashion, which are self-timed and exhibit temporal dynamics analogous to those of their biological counterparts (see the sketch after this list)

  • In this paper we proposed a spiking neural network model that solves the stereo-correspondence problem efficiently by exploiting an event-based representation, and that is amenable to neuromorphic hardware implementations
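
To make the highlighted architecture concrete, the following is a minimal Python sketch, not the authors' implementation, of event-driven disparity detection with leaky Integrate-and-Fire neurons: asynchronous events from a left and a right sensor drive neurons tuned to candidate disparities, and a neuron fires only when spatio-temporally coincident inputs support its disparity. All names and parameter values (process_event, TAU, V_THRESH, W_EXC, the toy 1-D geometry) are illustrative assumptions, and the cooperative excitatory/inhibitory interactions of a full Marr-Poggio-style network are omitted for brevity.

```python
# Illustrative sketch (not the authors' implementation): leaky Integrate-and-Fire
# coincidence detectors receive asynchronous events from a left and a right
# event-based sensor and spike when near-coincident events support a disparity.
# All parameter names and values below are assumptions chosen for the example.

import numpy as np

WIDTH, MAX_DISP = 16, 4                   # 1-D retina width and disparity range (toy values)
TAU, V_THRESH, W_EXC = 10e-3, 1.0, 0.6    # membrane time constant [s], threshold, synaptic weight

# One membrane potential per (cyclopean position, disparity) coincidence neuron.
v = np.zeros((WIDTH, MAX_DISP + 1))
last_t = 0.0

def process_event(t, x, eye):
    """Inject one retina event into the coincidence population and report spikes."""
    global last_t
    v[:] *= np.exp(-(t - last_t) / TAU)   # passive membrane decay since the previous event
    last_t = t
    for d in range(MAX_DISP + 1):
        # A disparity-d neuron at cyclopean position x_c listens to left pixel x_c
        # and right pixel x_c - d (toy geometry; real receptive fields differ).
        x_c = x if eye == "L" else x + d
        if 0 <= x_c < WIDTH:
            v[x_c, d] += W_EXC
            if v[x_c, d] >= V_THRESH:
                v[x_c, d] = 0.0           # reset after firing
                print(f"t={t*1e3:.1f} ms: disparity neuron (x={x_c}, d={d}) spiked")

# Two nearly synchronous events consistent with disparity 2 drive a single spike.
process_event(0.000, 7, "L")
process_event(0.002, 5, "R")
```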

Introduction

Event-based vision sensors transmit the addresses of their pixels asynchronously on a digital bus, as soon as they generate a spike. The neuromorphic processors we recently developed[10,11] use address-events to receive, process, and transmit spikes: input address-events represent spikes that are delivered to destination synapse circuits, and output address-events represent spikes produced by the source silicon neurons. These neuromorphic processors typically comprise large arrays of silicon neurons that can be arbitrarily configured to implement different types of neural networks for processing real-time event-based data, such as the data produced by silicon retinas[10]. The availability of these event-based neuromorphic sensors has led to increased interest in studying and developing a new class of event-based vision algorithms[12,13,14,15,16,17]. As our model is closely related to established models of stereopsis, its implementation with neurons and synapses that exhibit biologically realistic time constants suggests a potentially important role for precise temporal dynamics in biological stereo vision systems.
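
As a rough illustration of how such address-events might be represented and delivered in software, the sketch below assumes a hypothetical event record and routing table; the field names (timestamp_us, address), the routing-table format, and the deliver routine are assumptions for illustration and do not reflect the actual interface of the processors cited above.

```python
# Hedged sketch of an address-event representation (AER) stream: each spike is
# transmitted as the address of its source, and a router delivers it to the
# destination synapses. Field names and the routing-table format are assumptions,
# not the interface of the neuromorphic processors referenced in the text.

from dataclasses import dataclass

@dataclass
class AddressEvent:
    timestamp_us: int   # time at which the spike was generated
    address: int        # address of the source pixel or silicon neuron

# Toy routing table: source address -> list of (target neuron, synaptic weight).
routing_table = {
    42: [(7, 0.5), (8, -0.3)],   # pixel 42 excites neuron 7 and inhibits neuron 8
}

def deliver(event: AddressEvent, membrane: dict) -> None:
    """Deliver one input address-event to its destination synapse circuits."""
    for target, weight in routing_table.get(event.address, []):
        membrane[target] = membrane.get(target, 0.0) + weight

membrane_potentials: dict[int, float] = {}
deliver(AddressEvent(timestamp_us=1000, address=42), membrane_potentials)
print(membrane_potentials)   # {7: 0.5, 8: -0.3}
```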

