Current research in human-robot collaboration has yielded promising results, enabling robots to predict human motion intentions and move safely through shared spaces without collisions. However, truly efficient human-robot collaboration remains a complex challenge: robots must not only understand human behavior but also actively cooperate with human partners to accomplish specific tasks. Against this backdrop, this paper introduces a comprehensive behavior perception and robot trajectory planner (BPT-Planner). Taking continuous image data as input, the planner uses a CNN-RNN network to predict the behavioral intentions of human subjects. Based on these predictions and a set of human-robot action pairings, the planner constructs a motion space for the robot. It then applies the proximal policy optimization (PPO) algorithm to dynamically optimize the robot's action strategy and generate optimal motion trajectories. This optimization explicitly accounts for human behavior, ensuring that the robot's movements integrate seamlessly with human actions. To validate the BPT-Planner, we conducted offline training and real-time online validation in a pathology experiment scenario, and we carried out comparative experiments on datasets from other scenarios to further corroborate its efficacy. The results demonstrate the strong performance of the BPT-Planner across diverse scenarios and confirm its applicability and effectiveness in a variety of application environments. © 2025 Institute of Electrical Engineers of Japan and Wiley Periodicals LLC.
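The abstract does not give implementation details of the perception-planning pipeline. The following is a minimal sketch, assuming a PyTorch-style CNN-RNN intention classifier whose predicted intention distribution is concatenated with the robot state to form the observation of a PPO actor-critic policy; all module names, layer sizes, the discrete action set, and the pairing of human intentions to robot actions are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: layer sizes, names, and the action-pairing scheme are assumptions.
import torch
import torch.nn as nn

class IntentionPredictor(nn.Module):
    """Per-frame CNN encoder followed by a GRU over the frame sequence -> intention logits."""
    def __init__(self, num_intentions=5, hidden=128):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),      # -> (B*T, 32)
        )
        self.rnn = nn.GRU(32, hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_intentions)

    def forward(self, frames):                          # frames: (B, T, 3, H, W)
        b, t = frames.shape[:2]
        feats = self.cnn(frames.flatten(0, 1)).view(b, t, -1)
        _, h = self.rnn(feats)                          # h: (1, B, hidden)
        return self.head(h[-1])                         # intention logits: (B, num_intentions)

class RobotPolicy(nn.Module):
    """Actor-critic head for PPO; observation = robot state + predicted intention distribution."""
    def __init__(self, state_dim=7, num_intentions=5, num_actions=6):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(state_dim + num_intentions, 64), nn.Tanh())
        self.actor = nn.Linear(64, num_actions)         # discrete robot actions paired with human intentions
        self.critic = nn.Linear(64, 1)

    def forward(self, robot_state, intention_probs):
        x = self.backbone(torch.cat([robot_state, intention_probs], dim=-1))
        return torch.distributions.Categorical(logits=self.actor(x)), self.critic(x)

# Example forward pass with random data.
predictor, policy = IntentionPredictor(), RobotPolicy()
frames = torch.randn(2, 8, 3, 64, 64)                  # batch of 2 clips, 8 frames each
intention = torch.softmax(predictor(frames), dim=-1)
dist, value = policy(torch.randn(2, 7), intention)
action = dist.sample()                                  # robot action fed to a PPO update loop
```

In this reading, the intention prediction restricts and conditions the robot's motion space, while the PPO loop (not shown) would update the policy from collected trajectories and a task reward.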