Abstract

In this study, an adaptive, object deformability-agnostic human-robot collaborative transportation framework is presented. The proposed framework combines the haptic information transferred through the object with human kinematic information obtained from a motion capture system to generate reactive whole-body motions on a mobile collaborative robot. Furthermore, it allows the object to be rotated intuitively and accurately during co-transportation, based on an algorithm that detects the human's rotation intention from torso and hand movements. First, we validate the framework at the two extremes of the object deformability range (i.e., a purely rigid aluminum rod and a highly deformable rope) using a mobile manipulator that consists of an omnidirectional mobile base and a collaborative robotic arm. Next, its performance is compared with that of an admittance controller during the co-carrying of a partially deformable object in a 12-subject user study. The quantitative and qualitative results of this experiment show that the proposed framework effectively handles the transportation of objects regardless of their deformability and provides intuitive assistance to human partners. Finally, we demonstrate the potential of our framework in a different scenario, in which the human and the robot co-transport a manikin using a deformable sheet.

Note to Practitioners: The transportation of objects that require the cooperation of multiple partners is a common task in industrial settings such as factories and warehouses. Existing human-robot collaboration solutions for this task have focused only on purely rigid objects, although deformable objects frequently need to be carried in real-world applications. In this paper, we introduce a human-robot collaborative transportation framework that can handle objects whose deformability ranges from purely rigid to highly deformable. In particular, the proposed framework generates whole-body motions on a mobile collaborative robot by combining the haptic information transmitted through the object with human motion information obtained from a motion capture system. Moreover, the framework includes an intuitive way to rotate the object during execution based on human hand and torso motion. The results of experiments in which objects with various deformability characteristics were transported in collaboration with a mobile manipulator demonstrate the high potential of the proposed approach in a laboratory setting. In the future, we plan to employ a less expensive vision-based human motion tracking system instead of the IMU-based system used in this study. This change would eliminate the need for wearable sensors, enhancing the usability of the presented framework in real-world scenarios.
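The abstract does not give implementation details, but the baseline it compares against, a Cartesian admittance controller, and the described blending of haptic and human kinematic channels can be illustrated with a minimal sketch. In the standard admittance law, the sensed interaction wrench drives a virtual mass-damper system, M * x_ddot + D * x_dot = f_ext, whose output velocity is commanded to the robot. Below, the combination with motion-capture information is approximated by a simple feedforward blend; the gains, the blending rule, and the function names are illustrative assumptions, not the authors' method.

```python
import numpy as np

# Illustrative virtual dynamics parameters (not the paper's values).
M = np.diag([10.0, 10.0, 10.0])   # virtual inertia [kg]
D = np.diag([40.0, 40.0, 40.0])   # virtual damping [Ns/m]
dt = 0.001                        # control period [s]

def admittance_step(v_prev, f_ext, v_human=np.zeros(3), alpha=0.5):
    """One integration step of the virtual dynamics (hypothetical sketch).

    v_prev  -- commanded Cartesian velocity from the previous step [m/s]
    f_ext   -- force sensed through the co-carried object [N]
    v_human -- human velocity estimated from motion capture (assumed input)
    alpha   -- blending gain between haptic and kinematic cues (assumed)
    """
    # Integrate M * a = f_ext - D * v to get the haptic velocity response.
    a = np.linalg.solve(M, f_ext - D @ v_prev)
    v_haptic = v_prev + a * dt
    # Blend the haptic response with the human-motion feedforward term.
    return (1.0 - alpha) * v_haptic + alpha * v_human
```

A pure admittance baseline corresponds to alpha = 0, where the robot reacts only to forces transmitted through the object; with a deformable object these forces are attenuated or delayed, which is what motivates adding the kinematic channel.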
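The rotation-intention detection from torso and hand movements is likewise only named, not specified. One plausible reading, sketched below under stated assumptions, is to compute the hand's angular velocity about the object's rotation center and to command a rotation only when it agrees in sign with the torso yaw rate and both exceed thresholds. All thresholds, inputs, and the agreement rule here are hypothetical.

```python
import numpy as np

def rotation_intention(torso_yaw_rate, hand_vel, hand_pos, grasp_center,
                       yaw_thresh=0.2, omega_thresh=0.05):
    """Return a rotation command sign (-1.0, 0.0, +1.0) from human cues.

    torso_yaw_rate -- torso angular velocity about the vertical [rad/s]
    hand_vel       -- planar hand velocity [m/s] (2D numpy array)
    hand_pos       -- planar hand position [m]
    grasp_center   -- assumed rotation center of the co-carried object [m]
    Thresholds are illustrative, not the paper's values.
    """
    r = hand_pos - grasp_center
    # Angular velocity of the hand about the center: z-component of
    # (r x v) / |r|^2 in the plane.
    omega_hand = (r[0] * hand_vel[1] - r[1] * hand_vel[0]) / max(np.dot(r, r), 1e-6)
    # Require torso and hand cues to agree before commanding a rotation.
    if (abs(torso_yaw_rate) > yaw_thresh and abs(omega_hand) > omega_thresh
            and np.sign(torso_yaw_rate) == np.sign(omega_hand)):
        return float(np.sign(torso_yaw_rate))
    return 0.0
```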
