Abstract
Anticipation of human movements is of great importance for service robots, as it is necessary to avoid interference and to predict areas where human–robot collaboration may be needed. In indoor scenarios, human movements often depend on objects with which they interacted before. For example, if a human interacts with a cup, the probability is high that a table or coffee machine is the next navigation goal. Typically, objects are grouped together in regions according to the related activities, so that environments consist of a set of activity regions. For example, a workspace region may contain a PC, a chair, and a table with many smaller objects on top of it. In this article, we present an approach to predict the navigation goal of a moving human in indoor environments. To this end, we combine prior knowledge about typical human transitions between activity regions with robot observations about the human’s current pose and last object interaction, and predict the navigation goal using Bayesian inference. In the experimental evaluation in several simulated environments, we demonstrate that our approach leads to a significantly more accurate prediction of the navigation goal in comparison to previous work. Furthermore, we show in a real-world experiment how such human motion anticipation can be used to realize foresighted navigation with an assistance robot, i.e., how predicted human movements can be used to increase the time efficiency of the robot’s navigation policy by anticipating the user’s navigation goal early and moving towards it.
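To illustrate the kind of Bayesian update the abstract describes, the following minimal Python sketch combines a transition prior over activity regions with an observation likelihood based on the last object interaction. All names, regions, and probability values here (e.g., TRANSITION_PRIOR, INTERACTION_LIKELIHOOD, predict_goal) are hypothetical illustrations, not taken from the paper, and the sketch omits the pose observation the full approach also uses.

```python
# Sketch of Bayesian navigation-goal prediction over a discrete set of
# activity regions. Priors and likelihoods below are invented for illustration.

# Prior: P(goal region | previous region), e.g., learned from typical
# human transitions between activity regions.
TRANSITION_PRIOR = {
    "kitchen":   {"workspace": 0.5, "dining_area": 0.4, "kitchen": 0.1},
    "workspace": {"kitchen": 0.3, "dining_area": 0.2, "workspace": 0.5},
}

# Likelihood: P(observed object interaction | goal region), e.g., interacting
# with a cup suggests the dining area or kitchen as the goal.
INTERACTION_LIKELIHOOD = {
    "cup": {"kitchen": 0.4, "dining_area": 0.5, "workspace": 0.1},
    "pc":  {"kitchen": 0.1, "dining_area": 0.1, "workspace": 0.8},
}

def predict_goal(previous_region: str, observed_object: str) -> dict:
    """Posterior over goal regions: P(g | o, r_prev) ∝ P(o | g) * P(g | r_prev)."""
    prior = TRANSITION_PRIOR[previous_region]
    likelihood = INTERACTION_LIKELIHOOD[observed_object]
    unnormalized = {g: likelihood.get(g, 0.0) * p for g, p in prior.items()}
    total = sum(unnormalized.values())
    return {g: v / total for g, v in unnormalized.items()}

if __name__ == "__main__":
    # A human coming from the kitchen who just interacted with a cup.
    posterior = predict_goal("kitchen", "cup")
    print(max(posterior, key=posterior.get), posterior)
```

Under this toy model, observing the cup interaction shifts the posterior towards the dining area; a robot could then move towards that region ahead of the human, which is the foresighted-navigation behavior the abstract refers to.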