Abstract

This paper presents a method for visual-LiDAR odometry and forest mapping that leverages tree trunk detection and LiDAR localization techniques. In environments such as dense forests, where GPS signals are unreliable, we employ camera and LiDAR sensors to estimate the robot's position accurately. However, forested and orchard settings introduce unique challenges, including a diverse mixture of trees, tall grass, and uneven terrain. To address these complexities, we propose a distance-based filtering method that extracts only tree-trunk data from the 2D LiDAR scan. By restoring each arc observed by the LiDAR to its full circular shape, we obtain the position and radius of each reference tree in the LiDAR coordinate system; these values are then stored in a tree-trunk database. Our approach runs visual SLAM and LiDAR SLAM independently and then integrates their outputs with an Extended Kalman Filter (EKF) to improve odometry estimation. Using the resulting odometry and the EKF, we generate a tree map from the observed trees. In addition, we use the mapped tree positions as landmarks to reduce localization error in the proposed SLAM algorithm. Experimental results show that the loop-closing error ranges from 0.3 to 0.5 meters. We expect this method to also be applicable to path planning and navigation.
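To make the arc-to-circle step concrete, the sketch below shows one common way to recover a trunk's center and radius from a cluster of 2D LiDAR arc points, assuming an algebraic least-squares (Kåsa) circle fit. This is an illustrative assumption rather than the paper's exact procedure; the function name fit_trunk_circle, the noise level, and the example trunk geometry are hypothetical.

```python
# Minimal sketch (not the authors' implementation): fit a circle to the
# arc of 2D LiDAR hits on one trunk, assuming the cluster has already been
# isolated by a distance-based filter as described in the abstract.
import numpy as np

def fit_trunk_circle(points):
    """Fit a circle to 2D arc points; returns (cx, cy, radius).

    points: (N, 2) array of x, y hits on one trunk in the LiDAR frame, N >= 3.
    Solves x^2 + y^2 + a*x + b*y + c = 0 in the least-squares sense (Kasa fit).
    """
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    b = -(x**2 + y**2)
    (a_coef, b_coef, c_coef), *_ = np.linalg.lstsq(A, b, rcond=None)
    cx, cy = -a_coef / 2.0, -b_coef / 2.0
    radius = np.sqrt(cx**2 + cy**2 - c_coef)
    return cx, cy, radius

# Hypothetical example: noisy ~90-degree arc (the side facing the sensor)
# from a trunk of radius 0.15 m centered at (2.0, 1.0) in the LiDAR frame.
rng = np.random.default_rng(0)
theta = np.linspace(np.pi, 1.5 * np.pi, 25)
arc = np.column_stack([2.0 + 0.15 * np.cos(theta),
                       1.0 + 0.15 * np.sin(theta)])
arc += rng.normal(scale=0.005, size=arc.shape)  # ~5 mm range noise
print(fit_trunk_circle(arc))                    # approx (2.0, 1.0, 0.15)
```

When the visible arc spans only a small angle, the algebraic estimate can be refined with a geometric fit (e.g., nonlinear least squares on the radial residuals) before the trunk is added to the database.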
