Abstract
Event cameras are biologically inspired sensors that asynchronously detect brightness changes in the scene, independently for each pixel. Their output is a stream of events reported with low latency and microsecond temporal resolution, making them superior to standard cameras in highly dynamic scenarios, where standard cameras suffer from motion blur. Event cameras can be used in a wide range of applications, one of them being depth estimation, in both stereo and monocular settings. However, most known event-based depth estimation methods yield sparse depth maps because the event stream itself is sparse. We present a novel method that fuses information from events, standard frames, and odometry to exploit the advantages of both sensors. We propose to estimate dense disparity from standard frames whenever they become available, predict the disparity using odometry information, and track the disparity asynchronously between standard frames using optical flow computed from events. We demonstrate the performance of the method through several experiments in various setups, including synthetic data, the KITTI dataset enhanced with events, the MVSEC dataset, and our own stereo event camera recordings.
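To make the three-stage pipeline described above concrete, the following is a minimal structural sketch in Python. It shows only the control flow implied by the abstract; every helper (`stereo_disparity`, `predict_with_odometry`, `track_with_event_flow`) is a hypothetical placeholder, not the paper's implementation, and would be replaced by a real stereo matcher, a reprojection through the odometry pose, and an event-based optical-flow tracker.

```python
import numpy as np

# Hypothetical placeholder helpers: each stands in for a component the
# abstract names but does not specify.

def stereo_disparity(left_frame, right_frame):
    """Placeholder dense stereo matcher; returns a zero disparity map."""
    return np.zeros(left_frame.shape[:2], dtype=np.float32)

def predict_with_odometry(disparity, pose_delta):
    """Placeholder prediction step; a real system would reproject the
    disparity map through the relative camera motion pose_delta."""
    return disparity  # identity warp as a stand-in

def track_with_event_flow(disparity, events):
    """Placeholder asynchronous update; a real system would shift the
    disparity estimates by optical flow computed from the event stream."""
    return disparity  # identity update as a stand-in

def fuse(frames, odometry, event_slices):
    """Loop structure of the pipeline: re-initialize dense disparity at
    every standard frame, then predict with odometry and track with
    event flow until the next frame arrives."""
    for (left, right), pose_delta, events in zip(frames, odometry, event_slices):
        disparity = stereo_disparity(left, right)                 # 1. dense init from frames
        disparity = predict_with_odometry(disparity, pose_delta)  # 2. odometry prediction
        disparity = track_with_event_flow(disparity, events)      # 3. event-flow tracking
        yield disparity
```

In this sketch the event-flow step runs once per frame interval; in the actual asynchronous setting it would instead be applied repeatedly, event slice by event slice, between consecutive standard frames.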