Abstract

In this paper, we propose a Charuco board-based omnidirectional camera calibration method that avoids the overly complicated calibration procedures required by conventional methods. Specifically, the proposed method easily and precisely provides the two-dimensional and three-dimensional coordinates of patterned feature points by placing the omnidirectional camera inside a Charuco board-based cube structure. Using the coordinate information of these feature points, the intrinsic calibration of each camera constituting the omnidirectional camera can then be performed by estimating the perspective projection matrix. Furthermore, the extrinsic calibration of each camera can be performed without an additional calibration structure, even when only part of the calibration structure appears in the captured image. Compared to conventional methods, the proposed method is more reliable because it does not require additional adjustments to the mirror angle or the positions of several pattern boards. Moreover, the proposed method performs calibration independently of the number of cameras comprising the omnidirectional camera and of the camera rig structure. In the experiments, the proposed method yielded an average reprojection error of 0.37 pixels for the intrinsic parameters, which was better than that of conventional methods. For the extrinsic parameters, it achieved a mean absolute error of 0.90° in rotation displacement and 1.32 mm in translation displacement.
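
The corner detection and per-camera intrinsic step summarized above can be sketched with OpenCV's opencv-contrib aruco module (pre-4.7 API). The code below is a minimal stand-in rather than the authors' perspective-projection-matrix estimation, and the dictionary, board geometry, corner-count thresholds, and image folder are illustrative assumptions.

```python
# Minimal sketch (assumed setup, not the paper's exact pipeline): detect
# Charuco corners in several views from one camera of the rig and estimate
# that camera's intrinsic matrix and distortion coefficients.
import glob
import cv2

aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_5X5_250)
# 8x8 squares, 40 mm squares with 30 mm markers (illustrative values).
board = cv2.aruco.CharucoBoard_create(8, 8, 0.04, 0.03, aruco_dict)

all_corners, all_ids, image_size = [], [], None
for path in glob.glob("cam0/*.png"):  # hypothetical image folder for one camera
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    image_size = gray.shape[::-1]  # (width, height)
    marker_corners, marker_ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict)
    if marker_ids is None or len(marker_ids) < 4:
        continue
    # Interpolate chessboard (Charuco) corners from the detected ArUco markers.
    n, ch_corners, ch_ids = cv2.aruco.interpolateCornersCharuco(
        marker_corners, marker_ids, gray, board)
    if ch_ids is not None and n > 6:
        all_corners.append(ch_corners)
        all_ids.append(ch_ids)

# Estimate this camera's intrinsics; rms is the reprojection error in pixels.
rms, K, dist, rvecs, tvecs = cv2.aruco.calibrateCameraCharuco(
    all_corners, all_ids, board, image_size, None, None)
print("reprojection RMS (px):", rms)
print("intrinsic matrix K:\n", K)
```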

Highlights

  • With the development of head-mounted displays, it has become possible to provide users with immersive virtual reality (VR)

  • For omnidirectional camera extrinsic calibration, we fix the camera rig and arrange the proposed calibration structure to capture the images one-by-one for each camera; after detecting the corner points of the Charuco board in the captured images, we find the mapping relation between the cameras (see the sketch after these highlights)

  • The performance evaluation of the proposed calibration method was divided into two parts: (i) comparison with conventional camera calibration methods, and (ii) comparison with the method of Li et al. [31]
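
As a rough illustration of the extrinsic step in the second highlight, the sketch below estimates each camera's pose with respect to a common Charuco structure frame and composes the relative rotation and translation between two cameras. It uses OpenCV's opencv-contrib aruco module (pre-4.7 API) and standard rigid-transform composition under the simplifying assumption that both cameras observe the same Charuco face and that per-camera intrinsics are already available; it is not the paper's exact mapping procedure, and the dictionary, board geometry, image paths, and .npy intrinsic files are hypothetical placeholders.

```python
# Simplified sketch (not the authors' method): relative extrinsics between two
# rig cameras that each observe the same Charuco structure face. All file
# paths, the dictionary, and the board geometry are illustrative assumptions.
import cv2
import numpy as np

aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_5X5_250)
board = cv2.aruco.CharucoBoard_create(8, 8, 0.04, 0.03, aruco_dict)  # 40/30 mm

def structure_to_camera_pose(image_path, K, dist):
    """Estimate (R, t) taking structure-frame points into this camera's frame."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    marker_corners, marker_ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict)
    if marker_ids is None:
        raise RuntimeError("no ArUco markers found in " + image_path)
    _, ch_corners, ch_ids = cv2.aruco.interpolateCornersCharuco(
        marker_corners, marker_ids, gray, board)
    ok, rvec, tvec = cv2.aruco.estimatePoseCharucoBoard(
        ch_corners, ch_ids, board, K, dist, None, None)
    if not ok:
        raise RuntimeError("Charuco pose estimation failed for " + image_path)
    R, _ = cv2.Rodrigues(rvec)
    return R, tvec.reshape(3, 1)

# Hypothetical per-camera intrinsics saved from the intrinsic calibration step.
K0, dist0 = np.load("cam0_K.npy"), np.load("cam0_dist.npy")
K1, dist1 = np.load("cam1_K.npy"), np.load("cam1_dist.npy")

R0, t0 = structure_to_camera_pose("cam0/structure.png", K0, dist0)
R1, t1 = structure_to_camera_pose("cam1/structure.png", K1, dist1)

# Relative extrinsics: map camera-0 coordinates into the camera-1 frame.
R_01 = R1 @ R0.T
t_01 = t1 - R_01 @ t0
print("rotation cam0->cam1 (deg):", np.degrees(cv2.Rodrigues(R_01)[0].ravel()))
print("translation cam0->cam1 (m):", t_01.ravel())
```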

Summary

Introduction

With the development of head-mounted displays, it has become possible to provide users with immersive virtual reality (VR). In addition to computer graphics, capturing a real scene with a camera and transferring it to a VR space has become essential for providing VR content. We deal with polydioptric cameras such as Facebook Surround 360 [10], Google Jump [11], Ricoh Theta [12], and Samsung Gear 360 [13]. These omnidirectional cameras have divergent structures and capture surrounding image information simultaneously by overlapping the images from multiple cameras.

