Abstract

In this work, we propose a novel method of estimating optical flow from event-based cameras by matching the time surface of events. The proposed loss function measures the timestamp consistency between the time surface formed by the latest timestamp of each pixel and one that is slightly shifted in time. This makes it possible to estimate dense optical flow with high accuracy without restoring luminance or requiring additional sensor information. In experiments, we show that the proposed loss yields more accurate gradients and a more stable loss landscape than the variance loss used in the motion-compensation approach. In addition, we show that optical flow can be estimated with high accuracy by optimization with L1 smoothness regularization on publicly available datasets.
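As a concrete illustration, the sketch below builds a time surface of latest per-pixel timestamps and evaluates one plausible reading of the timestamp-consistency idea. The function names, the event-tuple format, and the nearest-neighbour warping are assumptions made here for illustration, not the paper's exact formulation.

```python
import numpy as np

def build_time_surface(events, height, width):
    """Time surface: per-pixel latest event timestamp (NaN where no event fired).
    `events` is assumed to be an iterable of (t, x, y, polarity) tuples sorted by t."""
    ts = np.full((height, width), np.nan)
    for t, x, y, _polarity in events:
        ts[int(y), int(x)] = t  # later events overwrite earlier ones
    return ts

def timestamp_consistency_loss(ts_ref, ts_shifted, flow, dt):
    """L1 discrepancy between a reference time surface and one taken dt later,
    after transporting each reference pixel along the candidate flow.
    ts_ref, ts_shifted: (H, W) latest-timestamp maps; flow: (H, W, 2) in pixels/second."""
    h, w = ts_ref.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Nearest-neighbour warp of each pixel along its flow over dt
    xw = np.clip(np.rint(xs + flow[..., 0] * dt), 0, w - 1).astype(int)
    yw = np.clip(np.rint(ys + flow[..., 1] * dt), 0, h - 1).astype(int)
    warped = ts_shifted[yw, xw]
    # If the flow is consistent, the warped later surface should differ from the
    # earlier one by roughly the temporal shift dt.
    diff = warped - (ts_ref + dt)
    valid = ~np.isnan(diff)
    return np.abs(diff[valid]).mean() if valid.any() else 0.0
```

In practice a differentiable (e.g. bilinear) warp would be used in place of the nearest-neighbour lookup so that the loss can be minimized by gradient descent over the flow field.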

Highlights

  • Event-based cameras are bio-inspired vision sensors that asynchronously output per-pixel brightness changes as an event stream instead of video frames [1]

  • We propose a loss function that measures the timestamp consistency of the time surface for optical flow estimation with event-based cameras

  • We evaluate dense optical flow estimated by optimization with L1 smoothness regularization (see the sketch after this list)
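A minimal sketch of the resulting objective, reusing the timestamp_consistency_loss shown after the Abstract and an arbitrary placeholder weight lam (both assumptions for illustration), combines the data term with an L1 (total-variation style) smoothness penalty:

```python
import numpy as np

def l1_smoothness(flow):
    """Sum of absolute spatial differences of the (H, W, 2) flow field."""
    return np.abs(np.diff(flow, axis=0)).sum() + np.abs(np.diff(flow, axis=1)).sum()

def total_objective(ts_ref, ts_shifted, flow, dt, lam=0.1):
    """Timestamp-consistency data term plus weighted L1 smoothness regularizer.
    Reuses timestamp_consistency_loss from the sketch after the Abstract."""
    data_term = timestamp_consistency_loss(ts_ref, ts_shifted, flow, dt)
    return data_term + lam * l1_smoothness(flow)
```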


Summary

Introduction

Event-based cameras are bio-inspired vision sensors that asynchronously output per-pixel brightness changes as an event stream instead of video frames [1]. Event-based cameras are well suited to optical flow estimation, since the precise timestamps of pixel-level intensity changes directly encode fine-grained motion information. Several attempts have been made to apply techniques based on spatiotemporal image derivatives and the brightness constancy assumption [5,6] to event-based vision. Benosman et al. [7] and Brosch et al. [8] approximated the spatial image derivative by integrating events and applied it to the optical flow constraint. Bardow et al. [9] estimated optical flow while simultaneously restoring image brightness from events alone. It is preferable to exploit the precise timing information directly rather than to approximate the image gradient or restore brightness from events, for which absolute brightness information is lost.
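For context, the brightness-constancy approach referenced above [5,6] leads to the classical optical flow constraint; with image brightness I(x, y, t) and flow (u, v),

```latex
I_x u + I_y v + I_t = 0,
\qquad\text{where } I_x = \frac{\partial I}{\partial x},\; I_y = \frac{\partial I}{\partial y},\; I_t = \frac{\partial I}{\partial t}.
```

Benosman et al. [7] and Brosch et al. [8] approximate the spatial derivatives I_x, I_y from integrated events and apply them to this constraint, whereas the method proposed here works directly on event timestamps.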

