Abstract

This paper discusses the co-design of integrated in-sensor and in-memory computing based on an analysis of event data and presents a system-level solution. By combining an event-based vision sensor (EVS) with event-driven computation-in-memory (CiM) as the processor, event data captured by the EVS are processed directly in the CiM. In this work, the EVS acquires scenery from a driving car and the resulting event data are analyzed. Based on the characteristic that EVS data are temporally dense and spatially sparse, an event-driven SRAM-CiM is proposed for extremely energy-efficient edge computing. In the event-driven SRAM-CiM, a set of 8T-SRAM cells stores the multiple-bit synaptic weights of spiking neural networks. Multiply-accumulate operation with the multiple-bit synaptic weights is demonstrated using pulse amplitude modulation and pulse width modulation. Considering future EVSs with higher image resolution and higher time resolution, the configuration of event-driven CiM for EVS is discussed.
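For illustration only, the sketch below models the event-driven multiply-accumulate behaviorally in Python under assumptions not stated in the abstract: a multi-bit weight read from the SRAM array is split into an amplitude code and a width code, the charges of the corresponding pulses reconstruct the weight value, and only rows that received an input event contribute to the accumulated sum. The 4-bit weight width, the 2/2 bit split between amplitude and width, the array size, and all names are illustrative assumptions, not the paper's circuit.

```python
import numpy as np

# Behavioral sketch (assumed parameters): 4-bit weights split into a
# 2-bit amplitude code (PAM) and a 2-bit width code (PWM).
N_ROWS = 8            # word lines of the 8T-SRAM weight array
AMP_BITS, WID_BITS = 2, 2

rng = np.random.default_rng(0)
weights = rng.integers(0, 2 ** (AMP_BITS + WID_BITS), size=N_ROWS)

def event_driven_mac(event_mask, weights):
    """Accumulate charge only for rows that received an input spike.

    Upper weight bits set the pulse amplitude (PAM), lower bits set the
    pulse width (PWM); the summed pulse charges reconstruct the multi-bit
    weight, and summing over active rows gives the MAC result.
    """
    amp_code = weights >> WID_BITS                 # upper bits -> amplitude
    wid_code = weights & ((1 << WID_BITS) - 1)     # lower bits -> width
    # Charge of the amplitude-coded pulse (unit width) plus the
    # width-coded pulse (unit amplitude), in weight-LSB units.
    charge = amp_code * (1 << WID_BITS) + wid_code
    return int(np.sum(charge[event_mask]))

# Spatially sparse, temporally dense events: only a few rows fire per step.
events = rng.random(N_ROWS) < 0.25
print("MAC output:", event_driven_mac(events, weights))
```

Because the input spikes are binary, the multiplication reduces to gating the decoded weight charges by the event mask, which is why computation is triggered only when events arrive and idle rows consume no MAC energy.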
