Abstract

Computer vision systems have proven useful in autonomous navigation applications, especially stereo vision systems for three-dimensional mapping of the environment. This article presents a novel camera calibration method to improve the accuracy of stereo vision systems for three-dimensional point localization. The proposed camera calibration method uses the least squares method to model the error caused by image digitalization and lens distortion. To obtain the coordinates of a particular three-dimensional point, a stereo vision system uses the information from two images taken by two different cameras. Then, the system locates the two-dimensional pixel coordinates of the three-dimensional point in both images and converts them into angles. With the obtained angles, the system finds the three-dimensional point coordinates through a triangulation process. The proposed camera calibration method is applied to the stereo vision system, and a comparative analysis between the real and calibrated three-dimensional data points is performed to validate the improvements. Moreover, the developed method is compared with three classical calibration methods to analyze its accuracy advantages over the tested methods.
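
The pixel-to-angle conversion mentioned in the abstract is not spelled out here. As a minimal sketch only, assuming an ideal pinhole camera with focal lengths fx, fy (in pixels) and principal point (cx, cy), a pixel can be mapped to horizontal and vertical viewing angles as shown below; the function name and parameters are illustrative assumptions, not the authors' notation.

```python
import math

def pixel_to_angles(u, v, fx, fy, cx, cy):
    """Convert a pixel coordinate (u, v) to horizontal/vertical viewing angles.

    Assumes an ideal pinhole camera: fx, fy are focal lengths in pixels and
    (cx, cy) is the principal point. Angles are returned in radians, measured
    from the optical axis.
    """
    horizontal = math.atan((u - cx) / fx)  # angle in the image-column direction
    vertical = math.atan((v - cy) / fy)    # angle in the image-row direction
    return horizontal, vertical

# Example: a point imaged 100 px right of the principal point with fx = 800 px
# lies about 7.1 degrees off the optical axis.
h, v = pixel_to_angles(u=740, v=360, fx=800.0, fy=800.0, cx=640.0, cy=360.0)
print(math.degrees(h), math.degrees(v))
```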

Highlights

  • Three-dimensional (3-D) measuring techniques are used in applications such as manufacturing processes, structural health monitoring, microsurgery, laparoscopic surgery, and especially autonomous navigation.[1,2,3,4,5] In these applications, accuracy is essential, and methods exist to improve the accuracy of 3-D measurements.[6,7]

  • Research in autonomous navigation applications has focused on stereo vision, which is used for 3-D mapping, detection, and location of objects.[12,13,14]

  • The developed stereo vision system (SVS) can locate 3-D points in a scene using intensity pattern-matching localization methods and applies a calibration method to improve the accuracy of the measurements (see the sketch after this list).
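
As context for the intensity pattern-matching localization mentioned in the last highlight, the following is a minimal sketch of one common approach, normalized cross-correlation template matching with OpenCV. It is not necessarily the authors' implementation; the image paths, patch size, and variable names (left.png, right.png, u0, v0, half) are illustrative assumptions.

```python
import cv2

# Illustrative sketch of intensity-pattern matching: locate a small template
# (e.g., one calibration-grid cross taken from the left image) inside the
# right image using normalized cross-correlation.
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Cut a 21x21 patch around a known cross center (u0, v0) in the left image.
u0, v0, half = 320, 240, 10
template = left[v0 - half:v0 + half + 1, u0 - half:u0 + half + 1]

# Normalized cross-correlation response over the right image.
response = cv2.matchTemplate(right, template, cv2.TM_CCOEFF_NORMED)
_, best_score, _, best_xy = cv2.minMaxLoc(response)

# Center of the best match in right-image pixel coordinates.
u_match = best_xy[0] + half
v_match = best_xy[1] + half
print(f"match at ({u_match}, {v_match}), score {best_score:.3f}")
```

Matching a patch around each detected cross in the left image against the right image yields the pixel pair needed for the angle conversion and triangulation steps described in the summary.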


Summary

Introduction

Applications like manufacturing processes, structural health monitoring, microsurgery, laparoscopic surgery, and especially autonomous navigation use three-dimensional (3-D) measuring techniques.[1,2,3,4,5] In these applications, accuracy is essential for the tasks to be performed, and there are methods to improve the accuracy of the 3-D measurements.[6,7] In autonomous navigation systems, the aim is to move an autonomous vehicle through the environment.

Calibration Method

The angles B_ij, C_ij, and β_ij are adjusted by adding the corrections ΔB_ij, ΔC_ij, and Δβ_ij to their respective angles, obtaining the calibrated angles Bc_ij, Cc_ij, and βc_ij (equations (12) to (14)). In this step, the central coordinates, in pixels, of all the crosses of the test grid are located for both cameras. The triangulation process is widely used in many applications to locate point coordinates in a scene.[43,44,45] In the developed SVS, triangulation is performed with the baseline of the cameras and the center positions of the test-grid crosses located in the left and right images. A set of triangulation equations was derived from the law of sines (equations (20) to (22)).[21,46,47] For example, the coordinate along the camera baseline is

x_ij = a · cos(B_ij) · sin(C_ij) / sin(B_ij + C_ij),    (21)

where a is the baseline length; y_ij and z_ij are obtained from the same angles and the elevation angle β_ij in equations (20) and (22).
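
As a hedged sketch of law-of-sines triangulation (a general construction, not necessarily the paper's exact equations (20) to (22)), the function below assumes the origin at the left camera, the x-axis along the baseline of length a, interior angles B and C measured in the horizontal plane at the left and right cameras, and an elevation angle β of the left-camera ray.

```python
import math

def triangulate(a, B, C, beta):
    """Law-of-sines triangulation sketch.

    a    : baseline length between the two cameras (output uses the same units).
    B, C : interior angles (radians) at the left and right cameras, measured in
           the horizontal plane between the baseline and each viewing ray.
    beta : elevation angle (radians) of the left-camera ray above that plane.

    Returns (x, y, z) with the origin at the left camera, x along the baseline,
    y the horizontal depth, and z the height.
    """
    # Horizontal range from the left camera to the point's ground projection,
    # from the law of sines: r / sin(C) = a / sin(pi - B - C) = a / sin(B + C).
    r = a * math.sin(C) / math.sin(B + C)
    x = r * math.cos(B)     # matches the form of equation (21)
    y = r * math.sin(B)     # horizontal depth
    z = r * math.tan(beta)  # height from the elevation angle
    return x, y, z

# Example: 30 cm baseline, symmetric 70-degree angles, 5-degree elevation.
print(triangulate(a=0.30, B=math.radians(70), C=math.radians(70),
                  beta=math.radians(5)))
```

With symmetric angles the recovered x is half the baseline, which is a quick sanity check on the geometry.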
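The angle corrections ΔB_ij, ΔC_ij, and Δβ_ij above come from an error model fitted with least squares. As an illustrative sketch only, assuming a simple polynomial error model (the paper's actual model may differ), the code below fits the correction for one angle with numpy.linalg.lstsq and applies it in the spirit of equations (12) to (14).

```python
import numpy as np

def fit_angle_correction(measured, reference, degree=2):
    """Fit delta = reference - measured as a polynomial of the measured angle
    using ordinary least squares. Returns the polynomial coefficients."""
    delta = reference - measured
    # Design matrix [1, B, B^2, ...] for each measured angle.
    A = np.vander(measured, degree + 1, increasing=True)
    coeffs, *_ = np.linalg.lstsq(A, delta, rcond=None)
    return coeffs

def apply_correction(measured, coeffs):
    """Calibrated angle = measured angle + modeled error (cf. equations (12)-(14))."""
    A = np.vander(measured, len(coeffs), increasing=True)
    return measured + A @ coeffs

# Toy data: reference angles from the test grid vs. distorted measurements.
reference = np.radians(np.linspace(30.0, 80.0, 12))
measured = reference + 0.002 * reference**2 - 0.001   # synthetic distortion
coeffs = fit_angle_correction(measured, reference)
print(np.degrees(apply_correction(measured, coeffs) - reference))  # residuals near zero
```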