Abstract

To improve the calibration accuracy and efficiency of scene cameras, this paper proposes a camera calibration method based on a single two-dimensional image, built on the pinhole imaging model and the principle of lens imaging. The method uses least squares to directly compute the mathematical model relating the world coordinate system to the computer image coordinate system, effectively addressing the problem that the camera's internal and external parameters are difficult to compute directly and that such computations have low accuracy. The method is simple and effective: it achieves fast camera calibration while guaranteeing accuracy and reliability, and it improves calibration efficiency. In general, calibration can be performed once the three-dimensional world coordinates and the corresponding computer image coordinates of six known points in space are available; increasing the number of known points further improves calibration accuracy. Validation on a large set of calibration image data shows that the algorithm achieves high accuracy with a small amount of computation and improves the reliability and precision of camera calibration, giving it both theoretical significance and practical value.
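The abstract does not spell out the exact linear model, but a least-squares mapping from world coordinates to image coordinates under a pinhole model is commonly realized with the Direct Linear Transform (DLT), where each point pair contributes two equations and six or more points suffice to estimate the 3×4 projection matrix. The following sketch illustrates that standard formulation (the function name and the NumPy-based SVD solver are this sketch's choices, not necessarily the paper's implementation):

```python
import numpy as np

def calibrate_dlt(world_pts, image_pts):
    """Estimate the 3x4 projection matrix P mapping homogeneous world
    coordinates to image coordinates, in the least-squares sense (DLT).
    Each of the >= 6 point pairs contributes two linear equations in the
    12 entries of P (11 degrees of freedom, since P is scale-free)."""
    A = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    A = np.asarray(A, dtype=float)
    # The least-squares solution of A p = 0 (with ||p|| = 1) is the right
    # singular vector belonging to the smallest singular value of A.
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1].reshape(3, 4)
```

With noise-free data the recovered matrix reprojects the calibration points exactly (up to scale); with more than six points, the extra rows of the system average out measurement noise, which matches the abstract's claim that adding known points improves accuracy.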
