Abstract
Spiking neural networks (SNNs) have shown much promise as an energy-efficient alternative to artificial neural networks (ANNs). SNNs trained by surrogate gradient (SG) descent are now capable of handling frame- and event-based vision tasks beyond classification (e.g., object detection, semantic segmentation). However, important questions about their behavior with respect to fundamental design choices remain under-explored (e.g., how does a specific neural coding scheme impact performance and robustness? Does a higher temporal latency necessarily imply better results?). In this paper, we focus on single object localization as a context to analyze deep convolutional SNNs on (1) the impact of temporal latency (i.e., the number of time-steps) on performance; (2) their robustness against sensor corruptions; and (3) the impact of neural coding schemes on performance with static images. We design a simple SNN baseline for frame- and event-based single object localization and compare it against a similar ANN architecture. Our experiments show that our approach can achieve competitive or better accuracy and robustness against common sensor corruptions, with significantly lower energy consumption. More importantly, our experimental analysis draws conclusions significantly different from well-known studies focused on SNNs trained with bio-plausible learning rules, which helps guide the design of SG-trained architectures and offers insight into design priorities for future neuromorphic technologies.
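Since the abstract hinges on surrogate-gradient training and temporal latency (the number of time-steps), the following minimal PyTorch sketch illustrates both ideas: a leaky integrate-and-fire (LIF) neuron whose non-differentiable spike function is replaced by a surrogate derivative in the backward pass, unrolled over a configurable number of time-steps. This is an illustrative sketch of the general technique, not the paper's implementation; all names and hyperparameters (`SurrogateSpike`, `lif_step`, `beta`, `threshold`, the fast-sigmoid surrogate) are hypothetical choices.

```python
# Illustrative sketch (not the paper's code): a LIF neuron trained with a
# surrogate gradient, the standard trick that makes SNNs compatible with
# backpropagation despite the non-differentiable spike function.
import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside step in the forward pass; fast-sigmoid surrogate in the backward pass."""
    @staticmethod
    def forward(ctx, membrane_potential):
        ctx.save_for_backward(membrane_potential)
        return (membrane_potential > 0.0).float()  # spike when potential crosses threshold

    @staticmethod
    def backward(ctx, grad_output):
        (membrane_potential,) = ctx.saved_tensors
        # Smooth surrogate used in place of the (zero-almost-everywhere) true derivative.
        surrogate_grad = 1.0 / (1.0 + 10.0 * membrane_potential.abs()) ** 2
        return grad_output * surrogate_grad

def lif_step(x, mem, beta=0.9, threshold=1.0):
    """One discrete time-step of a LIF neuron: leak, integrate, fire, reset."""
    mem = beta * mem + x                          # leaky integration of input current
    spk = SurrogateSpike.apply(mem - threshold)   # fire when potential exceeds threshold
    mem = mem - spk * threshold                   # soft reset after a spike
    return spk, mem

# Unroll T time-steps: the latency T is exactly the design choice studied above.
T, batch, features = 8, 4, 16
x_seq = torch.rand(T, batch, features, requires_grad=True)
mem = torch.zeros(batch, features)
spikes = []
for t in range(T):
    spk, mem = lif_step(x_seq[t], mem)
    spikes.append(spk)
out = torch.stack(spikes).mean(0)  # rate-coded readout averaged over time
out.sum().backward()               # gradients flow through the surrogate
```

Increasing `T` gives the network more spikes over which to encode information (potentially better accuracy) at the cost of more compute and latency; this is the trade-off the paper's first research question examines.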