Abstract
In this paper, we present a first-of-its-kind method to determine clear and repeatable guidelines for single-shot camera intrinsic calibration using multiple checkerboards. With the help of a simulator, we found the position and rotation intervals that allow optimal corner detector performance. With these intervals defined, we generated thousands of multi-checkerboard poses and evaluated them against ground truth values in order to obtain configurations that lead to accurate camera intrinsic parameters. We used these results to define guidelines for creating multi-checkerboard setups. We tested and verified the robustness of the guidelines in the simulator and, additionally, in the real world with cameras of different focal lengths and distortion profiles, which helps generalize our findings. Finally, we projected 3D LiDAR (Light Detection and Ranging) points into the image to confirm the quality of the estimated intrinsic parameters. We found it possible to obtain intrinsic parameters accurate enough for 3D applications with setups of at least seven checkerboards in a single image that follow our positioning guidelines.
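As a rough illustration of the single-shot idea (a minimal sketch, not the authors' implementation), the example below treats every checkerboard detected in one image as an independent calibration view and feeds them jointly to OpenCV's cv2.calibrateCamera. The board geometry (BOARD_SIZE, SQUARE_SIZE) and the per-board regions of interest (board_rois) are assumptions made for the example; in practice the regions could come from any multi-board detector.

```python
import numpy as np
import cv2

BOARD_SIZE = (7, 5)   # inner corners per board (assumed)
SQUARE_SIZE = 0.10    # square edge length in metres (assumed)

def board_object_points(board_size, square_size):
    """Planar 3D coordinates of one board's inner corners (z = 0)."""
    cols, rows = board_size
    pts = np.zeros((cols * rows, 3), np.float32)
    pts[:, :2] = np.mgrid[0:cols, 0:rows].T.reshape(-1, 2) * square_size
    return pts

def calibrate_from_single_image(gray, board_rois):
    """gray: one grayscale image; board_rois: list of (x, y, w, h), one per checkerboard."""
    obj_points, img_points = [], []
    for (x, y, w, h) in board_rois:
        roi = gray[y:y + h, x:x + w]
        found, corners = cv2.findChessboardCorners(roi, BOARD_SIZE)
        if not found:
            continue  # skip boards the detector misses
        corners = cv2.cornerSubPix(
            roi, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        corners += np.array([x, y], np.float32)  # back to full-image coordinates
        # each board contributes its own object/image point pair, i.e. its own "view"
        obj_points.append(board_object_points(BOARD_SIZE, SQUARE_SIZE))
        img_points.append(corners)
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, gray.shape[::-1], None, None)
    return rms, K, dist
```

Because the boards occupy different positions and orientations in the same shot, they play the role that multiple captures of a single board would play in a conventional calibration session.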
Highlights
Navigation robots, such as autonomous vehicles, require a highly accurate representation of their surroundings to navigate and reach their target safely
We found that the corner detector peak performance with respect to roll rotation between the camera plane normal and the checkerboard normal is between 0 and
We know that the corner detector performs better the closer the checkerboard is to the camera; in these experiments, we evaluated the impact of checkerboard scale on the accuracy of the intrinsic parameters
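The scale effect mentioned in the last highlight can be illustrated with a small, self-contained experiment (an assumption-laden sketch, not the paper's simulator): render a synthetic checkerboard at several square sizes, run OpenCV's corner detector, and compare the detections against the known ground-truth corner positions.

```python
import numpy as np
import cv2

def render_board(square_px, cols=8, rows=6, margin=40):
    """Synthetic grayscale checkerboard plus its ground-truth inner-corner positions."""
    img = np.full((rows * square_px + 2 * margin,
                   cols * square_px + 2 * margin), 255, np.uint8)
    for r in range(rows):
        for c in range(cols):
            if (r + c) % 2 == 0:
                y0, x0 = margin + r * square_px, margin + c * square_px
                img[y0:y0 + square_px, x0:x0 + square_px] = 0
    gt = np.array([[margin + (c + 1) * square_px, margin + (r + 1) * square_px]
                   for r in range(rows - 1) for c in range(cols - 1)], np.float32)
    return cv2.GaussianBlur(img, (5, 5), 0), gt   # slight blur mimics camera defocus

for square_px in (10, 20, 40, 80):          # smaller squares stand in for a farther board
    img, gt = render_board(square_px)
    found, corners = cv2.findChessboardCorners(img, (7, 5))
    if found:
        det = corners.reshape(-1, 2)
        # match each detection to its nearest ground-truth corner (ordering-agnostic)
        err = np.linalg.norm(det[:, None, :] - gt[None, :, :], axis=2).min(axis=1)
        print(f"square={square_px}px  mean corner error={err.mean():.3f}px")
    else:
        print(f"square={square_px}px  board not detected")
```

Such a toy study only probes detector localization error; relating that error to the resulting intrinsic-parameter accuracy is the kind of evaluation the paper performs with its simulator and ground truth.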
Summary
Navigation robots, such as autonomous vehicles, require a highly accurate representation of their surroundings to navigate and reach their target safely. Sensors such as cameras, radars, and LiDARs (Light Detection and Ranging) are commonly used to provide rich perception information, and each of these sensors can complement the others to supply reliable and accurate data. Cameras cannot provide reliable depth information at longer distances, whereas LiDARs capture dense and highly accurate range information at short, middle, and often long range, regardless of the lighting conditions.
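The verification step described in the abstract, projecting 3D LiDAR points into the camera image, can be sketched as follows. This is a minimal illustration assuming the LiDAR-to-camera extrinsics (R_cl, t_cl) are already known from a separate extrinsic calibration; with a well-estimated camera matrix K and distortion vector dist, the projected points should align with the corresponding image structure.

```python
import numpy as np
import cv2

def project_lidar_points(points_lidar, K, dist, R_cl, t_cl):
    """points_lidar: (N, 3) points in the LiDAR frame -> (M, 2) pixel coordinates."""
    pts_cam = points_lidar @ R_cl.T + t_cl.reshape(1, 3)   # into the camera frame
    pts_cam = pts_cam[pts_cam[:, 2] > 0]                   # keep points in front of the camera
    uv, _ = cv2.projectPoints(pts_cam.astype(np.float32),
                              np.zeros(3), np.zeros(3), K, dist)
    return uv.reshape(-1, 2)

def overlay(image_bgr, points_lidar, K, dist, R_cl, t_cl):
    """Draw the projected LiDAR points on a copy of the camera image for visual inspection."""
    uv = project_lidar_points(points_lidar, K, dist, R_cl, t_cl)
    out = image_bgr.copy()
    h, w = out.shape[:2]
    for u, v in uv:
        if 0 <= u < w and 0 <= v < h:
            cv2.circle(out, (int(u), int(v)), 1, (0, 255, 0), -1)
    return out
```

If the intrinsics are poorly estimated, the projected range points drift away from object boundaries in the image, which is what makes this overlay a useful qualitative check.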