Abstract

Modern manufacturing systems are human-robot systems in which human operators and intelligent robots collaborate to accomplish complex tasks. The performance of such systems relies heavily on reliable and efficient human-robot collaboration, which may be seriously compromised by temporal variations in human-to-robot trust. This paper proposes to model human trust as a Markov Decision Process (MDP) to capture the dynamic uncertainty in how robot performance affects human trust. System performance under such trust-based human-robot collaboration is formulated as an optimization problem in which an optimal task allocation policy is obtained to minimize the expected average cost per cycle of a persistent task while maximizing the probability of satisfying Linear Temporal Logic (LTL) specifications. A case study of an assembly process shows the effectiveness and benefits of the proposed trust-based human-robot collaboration.
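The trust model described above can be sketched as a small MDP. The following is a minimal illustration, not the paper's actual formulation: the trust levels, transition probabilities, and cost values are all hypothetical, and discounted-cost value iteration is used as a stand-in for the paper's average-cost-per-cycle objective with LTL constraints.

```python
import numpy as np

# Hypothetical trust MDP: 3 trust levels, 2 allocation actions.
# All numeric values below are illustrative assumptions, not from the paper.
STATES = ["low", "medium", "high"]   # human-to-robot trust levels
ACTIONS = ["human", "robot"]         # who performs the next task

# P[a][s][s'] : probability trust moves from state s to s' under action a.
# Assigning the robot can raise trust (on success) or lower it (on failure);
# assigning the human leaves trust largely unchanged.
P = np.array([
    [[0.9, 0.1, 0.0],                # action "human"
     [0.1, 0.8, 0.1],
     [0.0, 0.1, 0.9]],
    [[0.5, 0.5, 0.0],                # action "robot"
     [0.2, 0.3, 0.5],
     [0.0, 0.2, 0.8]],
])

# C[a][s] : expected cost of one task cycle in trust state s under action a.
# The robot is cheap when trusted but costly when trust is low (rework, oversight).
C = np.array([
    [5.0,  5.0, 5.0],                # human cost is trust-independent
    [12.0, 4.0, 2.0],                # robot cost drops as trust rises
])

def value_iteration(P, C, gamma=0.95, tol=1e-8):
    """Compute a cost-minimizing allocation policy over trust states."""
    V = np.zeros(P.shape[1])
    while True:
        Q = C + gamma * P @ V        # Q[a][s]: cost-to-go of action a in state s
        V_new = Q.min(axis=0)
        if np.max(np.abs(V_new - V)) < tol:
            return Q.argmin(axis=0), V_new
        V = V_new

policy, V = value_iteration(P, C)
print([ACTIONS[a] for a in policy])  # → ['human', 'robot', 'robot']
```

With these illustrative numbers, the policy assigns the task to the human while trust is low and hands it to the robot once trust recovers, which is the qualitative behavior the trust-based allocation scheme is designed to produce.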
