This article addresses the problem of recognizing human hand touch with a robot equipped with large-area tactile sensors covering its body. The problem is relevant in physical human–robot interaction for discriminating between human and non-human contacts, for triggering and driving cooperative tasks or robot motions, and for ensuring safe interaction. The underlying assumption is that voluntary physical interaction tasks involve hand touch on the robot body; the ability to recognize hand contacts is therefore a key element in distinguishing purposive human touch from other types of interaction. The proposed approach is based on a geometric transformation of the tactile data, consisting of pressure measurements associated with a non-uniform cloud of 3D points (taxels) spread over the non-linear manifold of the robot body surface, into tactile images that represent the contact pressure distribution in two dimensions. These tactile images are processed with deep learning algorithms to recognize human hands and to compute the pressure distribution applied by the individual hand segments, i.e., the palm and the single fingers. Experimental results obtained on a real robot covered with robot skin show the effectiveness of the proposed methodology. Moreover, to evaluate its robustness, various types of failures have been simulated. A further analysis of the transferability of the system has been performed, considering contacts occurring on a different sensorized robot part.
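The abstract does not specify the exact geometric transformation used to build the tactile images; the sketch below is only a minimal illustration of the general idea, assuming taxels laid out on a cylindrical robot link (a common geometry for forearm covers), NumPy/SciPy, and a hypothetical helper named taxels_to_tactile_image. It unrolls the taxel cloud onto 2D surface coordinates and interpolates the scattered pressure readings onto a regular pixel grid that a deep learning classifier could then consume.

```python
import numpy as np
from scipy.interpolate import griddata


def taxels_to_tactile_image(taxel_xyz, pressures, img_shape=(32, 32)):
    """Flatten taxels on a (hypothetical) cylindrical link into 2D and
    resample the scattered pressure field onto a regular pixel grid.

    taxel_xyz : (N, 3) taxel positions, cylinder axis assumed along z.
    pressures : (N,) contact pressure measurements.
    Returns an (H, W) tactile image of interpolated pressures.
    """
    x, y, z = taxel_xyz.T
    # Unroll the cylindrical surface: azimuth angle -> u, axial height -> v.
    u = np.arctan2(y, x)
    v = z
    uv = np.column_stack([u, v])

    # Regular pixel grid spanning the taxel layout.
    gu = np.linspace(u.min(), u.max(), img_shape[1])
    gv = np.linspace(v.min(), v.max(), img_shape[0])
    grid_u, grid_v = np.meshgrid(gu, gv)

    # Scattered-data interpolation of the non-uniform taxel cloud;
    # pixels outside the convex hull of the taxels are set to zero.
    return griddata(uv, pressures, (grid_u, grid_v),
                    method="linear", fill_value=0.0)


if __name__ == "__main__":
    # Synthetic example: 300 taxels scattered on a cylinder of radius 5 cm.
    rng = np.random.default_rng(0)
    theta = rng.uniform(-np.pi, np.pi, 300)
    height = rng.uniform(0.0, 0.20, 300)
    xyz = np.column_stack([0.05 * np.cos(theta),
                           0.05 * np.sin(theta),
                           height])
    # Fake pressure blob mimicking a single localized contact.
    press = np.exp(-((theta - 0.5) ** 2 + ((height - 0.10) / 0.03) ** 2))
    image = taxels_to_tactile_image(xyz, press)
    print(image.shape)  # (32, 32) tactile image, ready for a CNN
```

In this sketch the 2D image retains the spatial neighbourhood structure of the taxels, which is what allows standard convolutional architectures to be applied to the contact pressure distribution; the paper's actual mapping and network are described in the full text.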