Abstract

A new multiperspective stereo concept for real-time 3-D panoramic vision is presented in this paper. The main contribution is a novel event-driven stereo approach enabling 3-D 360° high-dynamic-range panoramic vision for real-time application in natural environments. The approach makes use of a sparse visual code generated by a rotating pair of dynamic vision line sensors. This system allows panoramic images to be generated by transforming events rather than capturing a large set of frames, which increases acquisition speed and thereby improves accuracy in dynamic scenes. This paper focuses on 3-D reconstruction and performance analysis with such a rotating multiperspective vision system. First, a theoretical analysis of the stereo matching accuracy is performed. Second, a depth error formulation is developed that takes motion into account and reveals the influence of scene dynamics on depth estimation. In this paper, disparity is measured in time units, which allows accurate depth maps to be estimated from a moving sensor system. Third, a stereo matching workflow based on standard stereo image matching is presented to assess the 3-D reconstruction accuracy. Finally, experimental results on real-world sensor data are reported, showing that the system allows 3-D reconstruction of high-resolution round views even under challenging illumination conditions.
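
To illustrate the abstract's central idea, that disparity between the two rotating line sensors is measured in time units and then converted to depth, the following minimal sketch shows one possible conversion under strongly simplified assumptions of our own: a known constant rotation rate, a baseline much smaller than the scene depth, and a small-angle triangulation Z ≈ b/Δθ. The rotation rate OMEGA, the baseline BASELINE, and the formula itself are illustrative assumptions, not the paper's actual depth or depth-error formulation.

```python
import numpy as np

# Illustrative sketch (not the paper's derivation): depth from a time-based
# disparity produced by a rotating pair of event-driven line sensors.
# Assumptions (hypothetical values, not taken from the paper):
#   - both line sensors rotate together at a known angular velocity OMEGA,
#   - a matched scene point triggers one event per sensor at times t_left, t_right,
#   - the baseline b is small relative to the scene depth, so the small-angle
#     triangulation Z ~ b / (angular disparity) applies.

OMEGA = 2.0 * np.pi * 1.0   # rotation rate [rad/s], e.g. one turn per second (assumed)
BASELINE = 0.10             # distance between the two line sensors [m] (assumed)

def depth_from_time_disparity(t_left: np.ndarray, t_right: np.ndarray) -> np.ndarray:
    """Estimate depth [m] from per-event time disparities [s].

    The time disparity dt = t_right - t_left is converted to an angular
    disparity d_theta = OMEGA * dt; for d_theta << 1, depth is roughly
    Z ~ BASELINE / d_theta.
    """
    d_theta = OMEGA * (np.asarray(t_right) - np.asarray(t_left))
    # Guard against (near-)zero disparity, which corresponds to points at infinity.
    d_theta = np.where(np.abs(d_theta) < 1e-9, np.nan, d_theta)
    return BASELINE / d_theta

# Matched event pairs: the same scene point seen first by one sensor, then the other.
t_left = np.array([0.1000, 0.2000, 0.3000])
t_right = np.array([0.1010, 0.2004, 0.3002])
print(depth_from_time_disparity(t_left, t_right))
```

As in conventional stereo, larger time disparities correspond to closer scene points, while distant points sweep past both sensors almost simultaneously; the sketch only mirrors this inverse relation and leaves out the motion-dependent error analysis developed in the paper.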
