Abstract
In this research chapter, a new method for building a point cloud of an object in a static scene is presented. The method uses images taken by an RGB camera mounted on a controllable robot moving around that object. While the pose of every video frame is estimated, a selection method is applied to extract the best frames from the video. Based on these selected images and their estimated rotation and translation vectors, a sparse 3D reconstruction is performed. The vectors are estimated by applying an Extended Kalman Filter to solve the Simultaneous Localization and Mapping (SLAM) problem, with ROS (Robot Operating System) as the framework. The covariance information provided by the Kalman filter is used as an additional selection criterion. A ROS-based sparse bundle adjustment (SBA) is then run on both the new point cloud and the estimated pose vectors. Finally, a dense 3D reconstruction is performed with the optimized rotation and translation vectors to obtain a denser point cloud. The method is tested in simulation with the Gazebo framework and the results are discussed; all experiments are explained in detail in this chapter. The source code of this project is available online in two public repositories, one for the filtering phase (https://github.com/engyasin/EKF-MonoSLAM_for_3D-reconstruction) and one for the 3D reconstruction phase (https://github.com/engyasin/3D-reconstruction_with_known_poses).
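To illustrate the covariance-based selection criterion mentioned above, the following is a minimal sketch, assuming a hypothetical `select_keyframe` helper and a tunable threshold not taken from the chapter: a frame is kept as a keyframe only when the trace of the EKF's camera-pose covariance block falls below the threshold, i.e. when the pose estimate is sufficiently confident.

```python
import numpy as np

# Assumed tuning value for illustration only; the chapter's actual
# threshold and selection logic may differ.
COV_TRACE_THRESHOLD = 1e-3

def select_keyframe(pose, covariance, keyframes):
    """Keep a frame only when the EKF pose estimate is confident enough.

    pose:       (R, t) rotation and translation estimated by EKF-MonoSLAM.
    covariance: 6x6 covariance block of the camera pose
                (3 rotation + 3 translation parameters).
    keyframes:  list of poses selected so far.
    """
    # The trace of the pose covariance is one simple scalar summary
    # of the estimate's uncertainty.
    uncertainty = np.trace(covariance)
    if uncertainty < COV_TRACE_THRESHOLD:
        keyframes.append(pose)
    return keyframes
```

In this sketch, frames passing the test would then feed the sparse reconstruction and SBA stages; the real pipeline may combine this criterion with other frame-quality measures.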