Abstract

Methods for socially aware robot path planning are increasingly needed as robots and humans coexist in shared industrial spaces. The practice of clearly separating human and robot zones on shop floors is giving way to spaces where both humans and robots operate, often collaboratively. To enable safer and more efficient manufacturing operations in shared workspaces, mobile robot fleet path planning needs to anticipate human movement. Accounting for the spatiotemporal nature of the problem, the present work introduces a spatiotemporal graph neural network that combines graph convolution and gated recurrent units with an attention mechanism to capture the spatial and temporal dependencies in the data and predict human occupancy from past observations. The obtained results indicate that the graph network-based approach is suitable for short-term prediction, but rising uncertainty beyond the short term limits its applicability. Furthermore, the addition of learnable edge weights, a feature exclusive to graph neural networks, enhances the predictive capability of the model. Adding workspace context-specific embeddings to graph nodes has additionally been explored, bringing modest performance improvements. Further research is needed to extend the predictive capabilities beyond the range of scenarios captured in the original training data, and towards establishing standardised benchmarks for testing human motion prediction in industrial environments.
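The architecture described above can be illustrated with a minimal sketch. The abstract does not give implementation details, so everything below is an assumption for illustration only: the workspace is modelled as a small graph of occupancy cells, a symmetrically normalised graph convolution aggregates spatial neighbour information, a hand-rolled GRU cell carries the temporal state per node, and a simple softmax attention over the hidden-state history produces a per-cell occupancy score. All dimensions, weight initialisations, and the toy 4-cell chain graph are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def graph_conv(X, A, W):
    """One graph-convolution layer: aggregate neighbour features through a
    self-loop-augmented, degree-normalised adjacency, then project with W."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    A_norm = d_inv_sqrt @ A_hat @ d_inv_sqrt  # symmetric normalisation
    return np.tanh(A_norm @ X @ W)

def gru_step(h, x, Wz, Wr, Wh):
    """One gated-recurrent-unit update per node; x is the spatially
    aggregated feature from the graph convolution."""
    hx = np.concatenate([h, x], axis=1)
    z = sigmoid(hx @ Wz)                      # update gate
    r = sigmoid(hx @ Wr)                      # reset gate
    h_tilde = np.tanh(np.concatenate([r * h, x], axis=1) @ Wh)
    return (1 - z) * h + z * h_tilde

# Hypothetical workspace: 4 occupancy cells connected in a chain.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

N, F, H, T = 4, 3, 8, 5                       # nodes, features, hidden dim, history length
Wg = rng.normal(0, 0.1, (F, H))               # graph-conv weights
Wz = rng.normal(0, 0.1, (2 * H, H))           # GRU gate weights
Wr = rng.normal(0, 0.1, (2 * H, H))
Wh = rng.normal(0, 0.1, (2 * H, H))
w_a = rng.normal(0, 0.1, H)                   # attention scoring vector
W_out = rng.normal(0, 0.1, (H, 1))            # occupancy read-out

# Encode T past observations: spatial aggregation, then temporal update.
h = np.zeros((N, H))
history = []
for t in range(T):
    X_t = rng.random((N, F))                  # stand-in for observed occupancy features
    h = gru_step(h, graph_conv(X_t, A, Wg), Wz, Wr, Wh)
    history.append(h)

# Temporal attention: softmax-weight the T hidden states for each node.
Hs = np.stack(history)                        # (T, N, H)
scores = Hs @ w_a                             # (T, N)
alpha = np.exp(scores) / np.exp(scores).sum(axis=0)
context = (alpha[..., None] * Hs).sum(axis=0)  # (N, H)

occupancy = sigmoid(context @ W_out).ravel()  # predicted per-cell occupancy in (0, 1)
print(occupancy.shape)
```

The learnable edge weights mentioned in the abstract would replace the fixed binary adjacency `A` with a trainable weighted matrix; the context-specific node embeddings would be concatenated onto each `X_t` before the graph convolution.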

