Abstract

Mimicking human skin sensations such as spontaneous multimodal perception and identification/discrimination of intermixed stimuli is severely hindered by the difficulty of efficiently integrating complex cutaneous receptor-emulating circuitry and the lack of an appropriate protocol to discern the intermixed signals. Here, a highly stretchable cross-reactive sensor matrix is demonstrated, which can detect, classify, and discriminate various intermixed tactile and thermal stimuli using a machine-learning approach. In particular, the multimodal perception ability is achieved by utilizing a learning algorithm based on the bag-of-words (BoW) model, which enables discrimination of each stimulus in various multimodal stimulus environments by learning and recognizing the stimulus-dependent 2D output image patterns. In addition, the single sensor device integrated in the cross-reactive sensor matrix exhibits multimodal detection of strain, flexion, pressure, and temperature. It is hoped that this proof-of-concept device with a machine-learning-based approach will provide a versatile route to simplify electronic skin systems with reduced architectural complexity and adaptability to various environments beyond the limitations of conventional "lock and key" approaches.
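To illustrate the idea, the following is a minimal sketch of a bag-of-words classification pipeline operating on 2D output image patterns, assuming synthetic sensor readouts and illustrative parameter choices (patch size, codebook size, classifier) rather than the authors' actual implementation.

```python
# Hedged sketch: BoW classification of 2D sensor-matrix output patterns.
# All data, parameters, and library choices (NumPy, scikit-learn) are
# assumptions for illustration, not the paper's implementation.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def extract_patches(image, size=4):
    """Slide a size x size window over the 2D pattern and flatten each
    patch into a 'visual word' descriptor."""
    h, w = image.shape
    return np.array([image[i:i + size, j:j + size].ravel()
                     for i in range(0, h - size + 1, size)
                     for j in range(0, w - size + 1, size)])

def bow_histogram(image, codebook):
    """Assign each patch to its nearest codeword and return the normalized
    histogram of codeword occurrences (the BoW feature vector)."""
    words = codebook.predict(extract_patches(image))
    hist = np.bincount(words, minlength=codebook.n_clusters).astype(float)
    return hist / hist.sum()

def make_sample(label):
    """Synthetic 16x16 'sensor image' for two hypothetical stimulus classes
    (e.g., a localized pressure spot vs. a stripe-like thermal pattern)."""
    img = rng.normal(0.0, 0.1, (16, 16))
    if label == 0:
        img[4:12, 4:12] += 1.0   # localized activation
    else:
        img[:, 6:10] += 1.0      # stripe-like activation
    return img

labels = np.array([i % 2 for i in range(40)])
images = [make_sample(y) for y in labels]

# Build the codebook by clustering patches from the training images.
all_patches = np.vstack([extract_patches(img) for img in images])
codebook = KMeans(n_clusters=8, n_init=10, random_state=0).fit(all_patches)

# Encode each image as a BoW histogram and train a simple classifier.
features = np.array([bow_histogram(img, codebook) for img in images])
clf = SVC(kernel="linear").fit(features, labels)

# Classify a new (synthetic) stimulus pattern.
test = make_sample(1)
print("predicted class:", clf.predict([bow_histogram(test, codebook)])[0])
```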
