Abstract

Humanoid robots still perceive their surroundings at a low level, which is a formidable obstacle to performing complex tasks. To improve their perceptual capabilities, high-level semantic information can be incorporated into their perceptual framework, enabling them to infer about and understand their environment accurately. Among the various techniques available, semantic Simultaneous Localization and Mapping (SLAM) has emerged as a promising avenue for achieving this perceptual enhancement. Although semantic SLAM can enhance a robot’s ability to perform tasks, accurately perceiving and associating semantic objects in complex, dynamic environments remains challenging. Hence, a solution is needed that swiftly and precisely associates object measurements with landmarks while accounting for their motion properties, and that promptly rectifies erroneous associations in real time. To this end, we propose a semantic perception approach designed explicitly for dynamic environments, adept at distinguishing between dynamic and static objects. Furthermore, we propose two association strategies: dynamic object association based on semantic map points and static object association based on object pose information. In addition, as the number of object measurements associated with a landmark grows, an association validation algorithm verifies the existing associations for that landmark, further improving association accuracy. The proposed method is extensively evaluated on both simulated indoor sequences captured from humanoid robot viewpoints and the KITTI dataset. Experimental results show that our approach significantly improves the robustness and accuracy of object association and trajectory estimation.
