Abstract

Humans recognize others’ emotional states, such as delight, anger, sorrow, and pleasure, through their multimodal expressions. However, it is unclear how this capability of emotion perception is acquired during infancy. This paper presents a neural network model that reproduces the developmental process of emotion perception through infant–caregiver interaction. The network comprises hierarchically structured restricted Boltzmann machines (RBMs) that receive multimodal expressions from a caregiver (visual, audio, and tactile signals in our current experiment) and learn to estimate her/his emotional states. We hypothesize that emotional categories of multimodal stimuli are acquired in a higher layer of the network owing to two important functions: 1) tactile dominance and 2) perceptual improvement. The former refers to the fact that tactile sensors can detect the emotional valence of stimuli (positive, negative, or zero valence) more directly than other sensors can, owing to characteristics of the nervous system of the skin. This function was implemented as semisupervised learning in the model. The latter refers to developmental changes in perceptual acuity, which were replicated by refining the variance parameters of the low-layered RBMs. Experimental results demonstrated that tactile dominance and perceptual improvement facilitate the differentiation of the emotional states of multimodal expressions; however, these effects appear only when both functions are included in the model together. Interpreting our results from a psychological perspective may help to elucidate the neural and social mechanisms underlying the development of emotion perception.
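To give a concrete picture of the variance-refinement idea described above, the sketch below implements a toy Gaussian-Bernoulli RBM trained with one-step contrastive divergence, whose visible standard deviation is shrunk across "developmental stages" to mimic perceptual improvement in the low-layered RBMs. This is only an illustrative assumption of how such a mechanism could be coded; the class name, dimensionalities, learning rate, annealing schedule, and the use of a single shared sigma are ours, not the paper's.

```python
# Minimal sketch (not the authors' code): a Gaussian-Bernoulli RBM whose visible
# variance is reduced across assumed "developmental stages", loosely mirroring
# the paper's perceptual-improvement mechanism. Shapes and schedule are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GaussianBernoulliRBM:
    def __init__(self, n_visible, n_hidden, sigma=1.0, lr=0.01):
        self.W = rng.normal(0, 0.01, size=(n_visible, n_hidden))
        self.b = np.zeros(n_visible)   # visible biases
        self.c = np.zeros(n_hidden)    # hidden biases
        self.sigma = sigma             # visible std: the "perceptual acuity" knob
        self.lr = lr

    def hidden_probs(self, v):
        # p(h = 1 | v) for Gaussian visibles with variance sigma^2
        return sigmoid(self.c + (v / self.sigma**2) @ self.W)

    def visible_mean(self, h):
        # E[v | h]; sampling would add N(0, sigma^2) noise
        return self.b + h @ self.W.T

    def cd1_step(self, v0):
        # One step of contrastive divergence (CD-1) on a mini-batch v0
        ph0 = self.hidden_probs(v0)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)
        v1 = self.visible_mean(h0)            # mean-field reconstruction
        ph1 = self.hidden_probs(v1)
        n = len(v0)
        # Positive phase minus negative phase
        self.W += self.lr * ((v0.T @ ph0) - (v1.T @ ph1)) / (n * self.sigma**2)
        self.b += self.lr * (v0 - v1).mean(axis=0) / self.sigma**2
        self.c += self.lr * (ph0 - ph1).mean(axis=0)

# Toy multimodal input: concatenated visual/audio/tactile features (dimensions assumed)
data = rng.normal(size=(256, 30))
rbm = GaussianBernoulliRBM(n_visible=30, n_hidden=16, sigma=2.0)

# "Perceptual improvement": shrink the visible variance stage by stage
for stage_sigma in (2.0, 1.0, 0.5):
    rbm.sigma = stage_sigma
    for epoch in range(50):
        rbm.cd1_step(data)
```

Tactile dominance is not shown here; in the full model it would presumably correspond to attaching valence labels derived from the tactile channel to the top-layer RBM (the semisupervised learning mentioned in the abstract), which is omitted from this sketch for brevity.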
