Abstract
Computer vision based applications have received notable attention worldwide because they interact directly with the physical world. In this paper, a novel method for object localization based on camera calibration and pose estimation is discussed. The three-dimensional (3D) coordinates are computed from multiple two-dimensional (2D) images taken from different views that meet the requirements of the camera calibration process. Camera calibration involves several steps, including estimation of the intrinsic and extrinsic parameters for the removal of lens distortion, and estimation of the object's size and the camera location. In addition, a technique to estimate the 3D pose from 2D images is proposed, and the resulting camera parameters and localization are applied to 3D reconstruction. The proposed approach is implemented on an HP Core i5 machine with the MATLAB support packages, and the experimental results are validated for both camera calibration and pose estimation.
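The relationship between intrinsic and extrinsic parameters and the 2D images used for calibration can be illustrated with the standard pinhole projection model. The sketch below is not the authors' implementation (which uses MATLAB support packages); it is a minimal numpy illustration, with assumed focal lengths, principal point, rotation, and translation values chosen purely for demonstration, of how a 3D world point maps to 2D pixel coordinates via the intrinsic matrix K and the extrinsic pose [R | t].

```python
import numpy as np

def project_points(K, R, t, X_world):
    """Project Nx3 world points to Nx2 pixel coordinates
    using the pinhole model: x ~ K (R X + t)."""
    X_cam = X_world @ R.T + t          # world frame -> camera frame
    x = X_cam @ K.T                    # apply intrinsic parameters
    return x[:, :2] / x[:, 2:3]        # perspective divide

# Illustrative (assumed) intrinsic matrix:
# focal lengths fx = fy = 800, principal point (cx, cy) = (320, 240)
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Illustrative (assumed) extrinsic pose: camera axes aligned with the
# world frame, object 5 units in front of the camera
R = np.eye(3)
t = np.array([0.0, 0.0, 5.0])

X = np.array([[0.0, 0.0, 0.0],         # point at the world origin
              [1.0, 0.0, 0.0]])        # point offset 1 unit along x
uv = project_points(K, R, t, X)
print(uv)   # origin lands on the principal point (320, 240)
```

Calibration inverts this mapping: given many known 3D–2D correspondences from different views, K, R, and t are estimated so that the projection error is minimized, which is why multiple views are required.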