Abstract

Although previous studies have shown that an emotional context may alter touch processing, it is not clear how visual contextual information modulates the sensory signals, or at what levels this modulation takes place. We therefore investigated how a toucher’s emotional expressions (anger, happiness, fear, and sadness) modulate the touchee’s somatosensory-evoked potentials (SEPs) in different temporal ranges. Participants were presented with tactile stimulation appearing to originate from expressive characters in virtual reality. Touch processing was indexed using SEPs, and self-reports of touch experience were collected. Early potentials were amplified after angry, happy, and sad facial expressions, while late potentials were amplified after anger but attenuated after happiness. These effects correspond to two stages of emotional modulation of tactile perception: anticipation and interpretation. The findings show that not only does touch affect emotion, but emotional expressions also affect touch perception. The affective modulation of touch appeared as early as 25 ms after touch onset, suggesting that emotional context is integrated into the tactile sensation at a very early stage.

Highlights

  • If different emotional expressions immediately cause differences in activity in the somatosensory cortex (SCx), it follows that emotional modulation happens before the extraction of tactile features (enabled by the SCx itself) is completed

  • Ratings of intensity and forcefulness were combined (r = 0.68), as were ratings of pleasantness and friendliness (r = 0.48). This resulted in four measures that were analysed using four repeated-measures analyses of variance (ANOVAs) with touch type and emotional expression as factors

  • People process touch differently depending on the surrounding socio-emotional cues[6]
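The rating-combination and ANOVA procedure mentioned in the highlights can be sketched as follows. This is a minimal illustration, not the authors' actual pipeline: the column names, the eight-participant dataset, the two touch-type levels, and the use of statsmodels' AnovaRM are all assumptions made for demonstration.

```python
import itertools
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Fabricated example ratings: 8 participants x 2 touch types x 4 expressions,
# one observation per cell (AnovaRM requires a balanced within-subject design).
rng = np.random.default_rng(0)
rows = []
for subj, touch, emotion in itertools.product(
        range(8), ["stroke", "tap"], ["anger", "happiness", "fear", "sadness"]):
    intensity = rng.normal(5, 1)
    forcefulness = 0.7 * intensity + rng.normal(0, 1)  # correlated scale
    rows.append({"subject": subj, "touch_type": touch, "expression": emotion,
                 "intensity": intensity, "forcefulness": forcefulness})
df = pd.DataFrame(rows)

# Combine the two correlated scales into a single measure by averaging,
# analogous to merging intensity and forcefulness ratings.
df["intensity_force"] = df[["intensity", "forcefulness"]].mean(axis=1)

# Repeated-measures ANOVA with touch type and emotional expression
# as within-subject factors.
res = AnovaRM(df, depvar="intensity_force", subject="subject",
              within=["touch_type", "expression"]).fit()
print(res.anova_table)
```

In practice the same call would be repeated once per combined measure (four ANOVAs in the study), swapping `depvar` each time.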

Introduction

We[12] showed that emotional stimuli (fair and unfair propositions) modulated touch processing, but found the effect in the later (>200 ms) temporal range of the SEP and in the reverse direction. It is not clear how visual contextual information modulates the sensory signals, or at what levels this modulation takes place[5]. A direct investigation of how emotional expressions modulate truly interpersonal touch has so far remained a technical impossibility: it would require seeing the emotional facial expressions of others and feeling their touch, while precisely controlling both. We investigated multiple basic emotional expressions to better align with multidimensional theories of emotion[17].
