Abstract

Today, most computer vision applications generate a heavy computational load, which can become an issue for autonomous systems such as robots. This load is largely due to the image sensor readout, which continuously captures images at a fixed rate and produces a relatively high-throughput bitstream. Techniques that minimize the data throughput therefore help to drastically reduce power consumption. Event-based image sensors can capture images with a low-throughput bitstream, thanks to a sampling strategy that eliminates temporal and spatial redundancies. This natively yields a data-compressed image, which favors lower storage and computation costs. This article presents an event-based image sensor incorporating a hybrid pixel matrix composed of two pixel types and an arbiterless asynchronous readout system. The results show a significant bitstream reduction compared to that of a standard CMOS image sensor. A test chip of our event-based image sensor has been designed and is currently under fabrication.
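The temporal-redundancy elimination mentioned above can be sketched in software: a pixel produces an event only when its intensity changes by more than a threshold, so static regions generate no data. This is a minimal illustrative model, not the paper's circuit; the function name, event format, and threshold value are assumptions for illustration.

```python
# Illustrative sketch of event-based (change-detection) sampling,
# not the paper's hardware implementation. A pixel emits an event
# only when its intensity changes by more than a threshold between
# two frames; unchanged pixels contribute nothing to the bitstream.

def frame_to_events(prev, curr, threshold=10):
    """Return (x, y, polarity) events for pixels whose intensity
    changed by more than `threshold` between two frames."""
    events = []
    for y, (row_prev, row_curr) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(row_prev, row_curr)):
            diff = c - p
            if abs(diff) > threshold:
                events.append((x, y, 1 if diff > 0 else -1))
    return events

# A mostly static 4x4 scene: only one pixel brightens.
frame0 = [[50] * 4 for _ in range(4)]
frame1 = [[50] * 4 for _ in range(4)]
frame1[2][1] = 120  # a moving object reaches pixel (x=1, y=2)

events = frame_to_events(frame0, frame1)
print(events)  # [(1, 2, 1)]
# A frame-based readout would transmit all 16 pixel values here;
# the event-based readout transmits a single event.
```

The same principle, applied spatially across neighboring pixels, removes spatial redundancy as well; together they account for the compressed bitstream described in the abstract.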
