Abstract

Cochlear implants (CIs) have had tremendous success in restoring a sense of hearing to the deaf. However, even after months of intensive rehabilitation, many CI users struggle to appreciate emotive tones in speech and music despite good speech comprehension. Failure to perceive emotional expression can result in maladjusted social behaviour, leading to detrimental socio-economic consequences. Recent advances in automated pattern identification of neuroimaging data can bring empirical support to the development of training programs for emotion perception rehabilitation in CI users. We used a machine-learning approach to identify emotion-processing biomarkers in high-density electroencephalograms collected from CI users (n = 22) and matched normal-hearing controls (n = 22). Participants’ brain responses elicited by short musical and vocal emotional (happy, sad, and neutral) stimuli were used to train an algorithm to identify, in each group, the pattern of brain responses that best predicts the presented emotion. Using this approach, we were able to confirm the presence of emotion-specific patterns of brain activity in CI users despite their reported emotion perception deficit. Identifying these patterns provides support for implementing an emotion perception rehabilitation program for this population: if an algorithm can differentiate aurally presented emotions, perhaps CI users can learn to discriminate emotions.
