Abstract

With the development of smart manufacturing, human-robot collaboration (HRC) is seen as the future of manufacturing. In a manufacturing environment with HRC, safety has attracted significant attention since the conventional separation of workspaces between robots and humans is removed. In this paper, a dynamic safety field is constructed that takes into account the collaborative assembly task requirements and the motions of both humans and robots. This safety field guides robot actions, ensuring that the robot carries out the required tasks while maintaining a safe distance from human workers. Based on the task requirements and the safety considerations between humans and robots, a robot motion planning and control problem is formulated. To solve this problem, a hybrid control scheme is proposed in which a residual reinforcement learning (RRL) method combines the safety field-based control method with a deep reinforcement learning (DRL) method. Numerical studies are conducted to evaluate the performance of the proposed method, which is compared with a pure deep Q-network (DQN) based method and a rapidly-exploring random tree (RRT) method. The simulation results show that the proposed method can effectively optimize robot trajectories and outperforms the DQN-based and RRT methods in terms of computational efficiency.
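The core idea of the hybrid scheme described above can be illustrated in a minimal sketch: a hand-crafted base controller (here, an assumed gradient-style controller that is attracted to the goal and repelled by a simple safety-field potential around the human) is summed with a learned residual correction. All function names, the field form, and the gains below are illustrative assumptions, not the paper's actual formulation; the residual policy is a placeholder where a trained DRL network would plug in.

```python
import numpy as np

def safety_field(pos, human_pos, strength=1.0):
    """Illustrative repulsive potential that grows as the robot nears the human."""
    d = np.linalg.norm(pos - human_pos)
    return strength / max(d, 1e-6)

def base_action(pos, goal, human_pos, step=0.1):
    """Assumed gradient-style base controller: attract toward the goal,
    repel from the human with a short-range cubic falloff."""
    attract = goal - pos
    repel = pos - human_pos
    grad = attract / (np.linalg.norm(attract) + 1e-6) \
         + repel / (np.linalg.norm(repel) + 1e-6) ** 3
    return step * grad

def residual_policy(pos):
    """Placeholder for the learned DRL residual; a trained network would go here."""
    return np.zeros_like(pos)

# Residual RL composition: executed action = safe base action + learned correction.
pos = np.array([0.0, 0.0])
goal = np.array([1.0, 1.0])
human = np.array([0.5, 0.6])
action = base_action(pos, goal, human) + residual_policy(pos)
```

With an untrained (zero) residual the robot simply follows the base safety-field controller; training the residual lets the DRL component refine trajectories without discarding the safety guarantees of the base law.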
