Wearable human-machine interfaces (HMIs) with bidirectional, multimodal tactile information exchange are of paramount importance in teleoperation, as they enable more intuitive interpretation and delivery of tactile signals. However, current sensing and feedback devices still lack sufficient integration and modalities. Here, we present a Tactile Sensing and Rendering Patch (TSRP) built from a customized, expandable array whose unit cells fuse a piezoelectric sensing and feedback element with an elastomeric triboelectric multidimensional sensor and its inner pneumatic feedback structure. The primary functional unit of the TSRP features a soft silicone substrate with a compact multilayer structure that integrates static and dynamic multidimensional tactile sensing by synergistically leveraging the triboelectric and piezoelectric effects. In addition, by exploiting the air chamber formed by the triboelectric sensor and the converse piezoelectric effect, the unit delivers pneumatic and vibrational haptic feedback simultaneously, regenerating both static and dynamic perception. With the aid of variants of this unit, the array-shaped TSRP can render different terrains, geometries, sliding, collisions, and other critical interactive events during teleoperation through skin perception. Moreover, immediate manipulation can be performed on the TSRP through its tactile sensors. A preliminary demonstration of the TSRP interface with a complete control module in robotic teleoperation shows the feasibility of assisting tasks in complex environments through direct tactile communication. The proposed device offers a potential route to bidirectional tactile communication with enriched key information, improving interaction efficiency in robot teleoperation and training.