Abstract

Event cameras offer many advantages for dynamic robotics due to their low-latency response to motion, high dynamic range, and inherent compression of the visual signal. Many algorithms easily achieve real-time performance when tested on offline datasets; however, with increasing camera resolution and deployment on fast-moving robots, latency-free operation is not guaranteed. The event-rate is not constant, but is proportional to the amount of motion in the scene or the velocity of the camera itself. Recent work has therefore instead reported the maximum event-rate at which an algorithm can run in real-time. In this paper we present the event-driven framework used on the iCub robot, which closes the loop between the algorithm's processing rate and the actual event-rate of the camera in order to smoothly control and limit latency, while allowing the algorithm to degrade gracefully when large bursts of events occur. We present two algorithms that process events in different ways and demonstrate the trade-off between latency and algorithm performance that the framework provides.
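
To make the closed-loop idea concrete, the sketch below shows one way such a loop can be structured: a producer thread stands in for the camera, and on each iteration the consumer drains every event that arrived while the previous batch was being processed, so latency stays bounded by roughly one processing period and bursts simply yield larger batches rather than a growing backlog. This is a minimal illustration under those assumptions only; it does not use the authors' actual iCub event-driven framework or its API, and all names and parameters here are hypothetical.

```cpp
// Minimal sketch of an adaptive event-batching loop (not the iCub framework).
#include <atomic>
#include <chrono>
#include <cstdio>
#include <deque>
#include <mutex>
#include <random>
#include <thread>
#include <vector>

struct Event { int x, y; bool polarity; double t; };

int main() {
    std::deque<Event> buffer;          // events waiting to be processed
    std::mutex m;
    std::atomic<bool> running{true};

    // Producer: emulates a camera whose event-rate varies with scene motion.
    std::thread camera([&] {
        std::mt19937 rng{42};
        double t = 0.0;
        while (running) {
            int n = 100 + int(rng() % 4900);   // varying burst size per millisecond
            {
                std::lock_guard<std::mutex> lk(m);
                for (int i = 0; i < n; ++i)
                    buffer.push_back({int(rng() % 304), int(rng() % 240),
                                      bool(rng() % 2), t});
            }
            t += 1e-3;
            std::this_thread::sleep_for(std::chrono::milliseconds(1));
        }
    });

    // Consumer: each iteration takes everything that arrived during the previous
    // batch, so the queue never grows unboundedly and latency tracks processing time.
    auto stop = std::chrono::steady_clock::now() + std::chrono::milliseconds(200);
    int iter = 0;
    while (std::chrono::steady_clock::now() < stop) {
        std::vector<Event> batch;
        {
            std::lock_guard<std::mutex> lk(m);
            batch.assign(buffer.begin(), buffer.end());
            buffer.clear();
        }
        if (batch.empty()) {
            std::this_thread::sleep_for(std::chrono::milliseconds(1));
            continue;
        }
        auto t0 = std::chrono::steady_clock::now();
        // Placeholder per-event work: cost grows with batch size, so a burst makes
        // this iteration slower and the next batch correspondingly larger
        // (graceful degradation instead of an ever-growing delay).
        volatile long acc = 0;
        for (const auto& e : batch)
            for (int k = 0; k < 200; ++k) acc += e.x + e.y + k;
        double dt = std::chrono::duration<double, std::milli>(
                        std::chrono::steady_clock::now() - t0).count();
        std::printf("batch %3d: %6zu events, processed in %.3f ms\n",
                    iter++, batch.size(), dt);
    }
    running = false;
    camera.join();
}
```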
