Abstract
As a depth sensing approach, stereo vision provides a good compromise between accuracy and cost; however, a key limitation is the restricted field of view of the conventional cameras used within most stereo configurations. By contrast, the use of spherical cameras within a stereo configuration offers omnidirectional stereo sensing. However, despite the significant image distortion present in spherical camera images, only very limited attempts have been made to study and quantify omnidirectional stereo depth accuracy. In this paper we construct such an omnidirectional stereo system, capable of real-time 360° disparity map reconstruction, as the basis for such a study. We first investigate the accuracy of using a standard spherical camera model for calibration combined with a longitude-latitude projection for omnidirectional stereo, and show that the depth error increases significantly as the angle from the camera optical axis approaches the limits of the camera field of view. In contrast, we then consider an alternative calibration approach that applies perspective undistortion under a conventional pinhole camera model, allowing omnidirectional cameras to be mapped to a conventional rectilinear stereo formulation. We find that this proposed approach exhibits improved depth accuracy at large angles from the camera optical axis when compared to omnidirectional stereo depth based on a spherical camera model calibration.
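As an illustration of the longitude-latitude projection mentioned above, the sketch below maps equirectangular pixel coordinates to unit ray directions and back. This is a minimal, generic formulation of the standard equirectangular mapping, not the paper's specific implementation; image dimensions and axis conventions here are assumptions for illustration.

```python
import numpy as np

def lonlat_to_ray(u, v, width, height):
    """Map equirectangular (longitude-latitude) pixel coordinates to a
    unit ray direction. Assumed convention: longitude spans [-pi, pi)
    across the image width, latitude spans [-pi/2, pi/2] over the height.
    """
    lon = (u / width) * 2.0 * np.pi - np.pi
    lat = np.pi / 2.0 - (v / height) * np.pi
    # Spherical-to-Cartesian conversion (y up, z along the optical axis).
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)
    return np.array([x, y, z])

def ray_to_lonlat(ray, width, height):
    """Inverse mapping: project a 3D ray back to equirectangular pixels."""
    x, y, z = ray / np.linalg.norm(ray)
    lon = np.arctan2(x, z)
    lat = np.arcsin(y)
    u = (lon + np.pi) / (2.0 * np.pi) * width
    v = (np.pi / 2.0 - lat) / np.pi * height
    return u, v
```

Under this mapping, epipolar curves in a vertically displaced spherical stereo pair align with image columns, which is what makes longitude-latitude rectification convenient for omnidirectional disparity estimation.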