Abstract

Humans can convey an array of emotions through complex and rich touch gestures. However, it is not clear how these touch gestures can be reproduced by interactive systems and devices in a remote, mediated communication context. In this article, we explore the design space of device-initiated touch for conveying emotions with an interactive system that reproduces a collection of human touch characteristics. For this purpose, we control a robotic arm to touch the forearm of participants with different force, velocity, and amplitude characteristics to simulate human touch. With the goal of adding touch as an emotional modality in human-machine interaction, we conduct two studies. After designing the touch device, we explore touch in a context-free setup and then in a controlled context defined by textual scenarios and the emotional facial expressions of a virtual agent. Our results suggest that certain combinations of touch characteristics are associated with the perception of different degrees of valence and arousal. Moreover, when the mixed signals (touch, facial expression, textual scenario) are non-congruent, i.e., do not a priori convey the same emotion, the message conveyed by touch seems to prevail over those conveyed by the visual and textual signals.
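
To make the parameterization concrete, the sketch below shows one way a device-initiated touch could be described by the three characteristics the abstract names (force, velocity, amplitude) and expanded into a simple stroke profile for a robot-arm controller. This is not the authors' implementation: the units, the constant-velocity profile, and all names in the code are assumptions made purely for illustration.

```python
# Illustrative sketch only (assumed names, units, and motion profile),
# not the system described in the paper.

from dataclasses import dataclass
import math


@dataclass
class TouchGesture:
    force_n: float        # normal force on the forearm, in newtons (assumed unit)
    velocity_mm_s: float  # stroke velocity along the forearm, in mm/s (assumed unit)
    amplitude_mm: float   # stroke length on the forearm, in mm (assumed unit)


def stroke_profile(gesture: TouchGesture, dt: float = 0.01):
    """Yield (time_s, position_mm, force_n) samples for one constant-velocity stroke."""
    duration = gesture.amplitude_mm / gesture.velocity_mm_s
    steps = max(1, math.ceil(duration / dt))
    for i in range(steps + 1):
        t = min(i * dt, duration)
        yield t, gesture.velocity_mm_s * t, gesture.force_n


if __name__ == "__main__":
    # Hypothetical values: a slow, light stroke versus a fast, firmer one.
    gentle = TouchGesture(force_n=0.5, velocity_mm_s=30.0, amplitude_mm=90.0)
    brisk = TouchGesture(force_n=2.0, velocity_mm_s=150.0, amplitude_mm=90.0)
    for name, g in (("gentle", gentle), ("brisk", brisk)):
        samples = list(stroke_profile(g))
        print(f"{name}: {len(samples)} samples over {samples[-1][0]:.2f} s")
```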
