Wearable devices, such as data gloves and electronic skins, can perceive human instructions, behaviors, and even emotions by tracking hand motion with the help of knowledge learning. However, the single-mode sensors in such devices, which capture only shape or only position, often lack the comprehensive information needed to perceive interactive gestures. Meanwhile, the limited computing power of wearable platforms restricts both the multimode fusion of different sensing data and the deployment of deep learning networks. We propose a perceptive fusion electronic skin (PFES) with a bioinspired hierarchical structure in which the magnetization state of a magnetostrictive alloy film responds sensitively to external strain or magnetic fields. Installed at the joints of a hand, the PFES perceives curvature (joint shape) and magnetism (joint position) by mapping the corresponding signals onto a two-directional continuous distribution whose two edges represent the contributions of curvature radius and magnetic field, respectively. A reinforced knowledge distillation method is developed that autonomously selects knowledge closer to the user's hand movement characteristics, learning and compressing a teacher model for rapid deployment on wearable devices. By integrating this autonomous learning algorithm, the PFES fuses curvature-magnetism dual information, ultimately achieving human-machine interaction with gesture recognition and haptic feedback for cross-space perception and manipulation.
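The knowledge distillation component mentioned above can be illustrated with a minimal, generic sketch. The abstract does not specify the PFES's selection policy, network architectures, or hyperparameters, so the temperature `T`, loss weight `alpha`, and three-class logits below are illustrative assumptions following the standard (Hinton-style) distillation objective, not the authors' exact method:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; higher T yields a softer distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label,
                      temperature=4.0, alpha=0.7):
    """Combine a soft-target term, KL(teacher || student) at temperature T
    (scaled by T^2, as is conventional), with a hard-label
    cross-entropy term on the student's own predictions."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    kl = sum(pt * math.log(pt / ps)
             for pt, ps in zip(p_teacher, p_student))
    hard = softmax(student_logits)  # T = 1 for the hard-label term
    cross_entropy = -math.log(hard[true_label])
    return alpha * temperature**2 * kl + (1 - alpha) * cross_entropy

# Example: a student whose logits roughly mimic the teacher's
# incurs only a small combined loss.
teacher = [2.0, 0.5, -1.0]
student = [1.8, 0.6, -0.9]
loss = distillation_loss(student, teacher, true_label=0)
```

A compressed student trained against such an objective is what allows inference to run within the computing budget of a wearable device; the "reinforced" selection of user-specific knowledge described in the abstract would sit on top of this basic loss.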