Abstract

Accurate and flexible calibration is a prerequisite for visual sensor networks to retrieve metric information from image data. This paper presents the design and implementation of an accurate and flexible calibration method for a class of visual sensor networks intended for 3D measurement and tracking in large volumes. The proposed method employs a generic camera model that is applicable to wide-angle lens cameras as well as conventional cameras. It does not require all the employed cameras to share a common field of view; only pairwise overlap between cameras is needed. In the calibration process, the relative poses between stereo camera pairs are first initialized using essential matrix decomposition and then refined with the Levenberg-Marquardt algorithm. A weighted vision graph is proposed to select optimal transformation paths among cameras using Dijkstra's shortest path algorithm for multi-camera calibration. The global coordinate frame is then constructed using a four-marker calibration triangle. Finally, a Unity3D-based virtual platform, in which the number and configuration of cameras, as well as the environment scene, can be arbitrarily edited, is designed to test the proposed calibration algorithms. Extensive experiments on synthetic and real data demonstrate the effectiveness of the proposed multi-camera calibration method. Experimental results show that the method is accurate and easy to implement in the presence of noise.
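The vision-graph step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the toy graph, the camera identifiers, and the use of a pairwise calibration-quality score (e.g. mean reprojection error) as the edge weight are all assumptions made here for clarity. Cameras are graph nodes, an edge links each pair of cameras with overlapping fields of view, and Dijkstra's algorithm picks the lowest-cost chain of pairwise transformations from a reference camera to every other camera.

```python
import heapq

def dijkstra_paths(graph, source):
    """Shortest transformation paths from a reference camera.

    graph: dict mapping camera id -> list of (neighbor, weight) pairs.
    An edge exists only between cameras with pairwise view overlap;
    the weight scores the quality of that pairwise calibration
    (an assumed choice for illustration, e.g. reprojection error).
    Returns (dist, prev): total path cost per camera, and each
    camera's predecessor for path reconstruction.
    """
    dist = {source: 0.0}
    prev = {}
    heap = [(0.0, source)]
    done = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in done:
            continue
        done.add(u)
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    return dist, prev

def path_to(prev, source, target):
    """Chain of cameras whose pairwise transforms link target to source."""
    path = [target]
    while path[-1] != source:
        path.append(prev[path[-1]])
    return path[::-1]

# Hypothetical 4-camera network: direct edge 0-2 is noisier (weight 1.5)
# than the two-hop route 0-1-2, so Dijkstra prefers the longer chain.
graph = {
    0: [(1, 0.4), (2, 1.5)],
    1: [(0, 0.4), (2, 0.3)],
    2: [(0, 1.5), (1, 0.3), (3, 0.6)],
    3: [(2, 0.6)],
}
dist, prev = dijkstra_paths(graph, 0)
print(path_to(prev, 0, 3))  # -> [0, 1, 2, 3]
```

Chaining the relative poses along the returned path then yields each camera's pose in the reference frame; weighting edges by calibration quality keeps the accumulated error along the chain small.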


