Abstract

Obtaining 3-D data by LIDAR from unmanned aerial vehicles (UAVs) is vital for the field of remote sensing; however, the highly dynamic motion of UAVs and the narrow field of view of LIDAR make self-localization based solely on a LIDAR sensor highly challenging. To this end, we propose a robust simultaneous localization and mapping (SLAM) system that combines image data from a vision sensor with point clouds from LIDAR. In the front-end of the proposed system, stable line and plane features are extracted from the point clouds through clustering, and the relative pose between two consecutive frames is computed with a least-squares iterative closest point (ICP) algorithm. A novel direct odometry algorithm then combines the image frames with the sparse point clouds, using this relative pose as a prior. In the back-end, the pose estimates are refined and a textured 3-D map is built at a lower frequency. Extensive experiments show that our method achieves robust and highly precise localization and mapping for UAVs.
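
The abstract does not include implementation details, but the least-squares ICP step it describes, aligning two consecutive point-cloud frames, can be illustrated with a minimal sketch. The following Python code alternates nearest-neighbor association with a closed-form (SVD-based) least-squares pose update; all function names and parameters here are illustrative assumptions, not the authors' code.

```python
# Minimal illustrative sketch of least-squares ICP for aligning two
# point clouds (e.g., consecutive LIDAR frames). NOT the paper's
# implementation; names and parameters are assumptions.
import numpy as np
from scipy.spatial import cKDTree

def best_fit_transform(src, dst):
    """Closed-form least-squares rigid transform (SVD) mapping src -> dst."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)           # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                      # correct a reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

def icp(src, dst, max_iters=50, tol=1e-6):
    """Iteratively associate nearest neighbors and refit the pose."""
    tree = cKDTree(dst)
    R_total, t_total = np.eye(3), np.zeros(3)
    cur = src.copy()
    prev_err = np.inf
    for _ in range(max_iters):
        dists, idx = tree.query(cur)              # nearest-neighbor matches
        R, t = best_fit_transform(cur, dst[idx])  # least-squares update
        cur = cur @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
        err = dists.mean()
        if abs(prev_err - err) < tol:             # converged
            break
        prev_err = err
    return R_total, t_total                       # relative pose src -> dst
```

In the pipeline the abstract describes, the relative pose produced by this step would serve as the prior for the direct image-based odometry; here it is shown only as a standalone frame-to-frame alignment.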
