Abstract

Brain-body interactions guide the development of behavioral and cognitive functions. Sensory signals during behavior are relayed to the brain and evoke neural activity. This feedback is important for the organization of neural networks via neural plasticity, which in turn facilitates the generation of motor commands for new behaviors. In this study, we investigated how brain-body interactions develop and affect reward-based learning. We constructed a spiking neural network (SNN) model for the reward-based learning of canonical babbling, i.e., the combination of a consonant and a vowel. Motor commands to a vocal simulator were generated by SNN output, and auditory signals representing the vocalized sound were fed back into the SNN. Synaptic weights in the SNN were updated using spike-timing-dependent plasticity (STDP). Connections from the SNN to the vocal simulator were modulated by reward signals derived from the saliency of the vocalized sound. Our results showed that, under auditory feedback, STDP enabled the model to rapidly acquire babbling-like vocalization. We found that some neurons in the SNN were more highly activated during vocalization of a consonant than during other sounds. That is, neural dynamics in the SNN adapted to task-related articulator movements. Accordingly, body representation in the SNN facilitated brain-body interaction and accelerated the acquisition of babbling behavior.
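As a rough illustration of the two plasticity rules the abstract mentions, unsupervised STDP within the SNN and reward-modulated updates on the connections to the vocal simulator, the sketch below shows a standard pair-based STDP rule with its weight change scaled by a scalar reward. All function names, parameter values, and the clipping range are assumptions for illustration; they are not taken from the paper.

```python
import numpy as np

# Hypothetical STDP constants; the paper's actual values are not given here.
A_PLUS, A_MINUS = 0.01, 0.012   # potentiation / depression amplitudes
TAU = 20.0                       # STDP time constant (ms)

def stdp_dw(dt):
    """Pair-based STDP kernel, dt = t_post - t_pre (ms).
    Pre-before-post (dt > 0) potentiates; post-before-pre depresses."""
    if dt > 0:
        return A_PLUS * np.exp(-dt / TAU)
    return -A_MINUS * np.exp(dt / TAU)

def reward_modulated_update(w, dt, reward, lr=1.0):
    """Scale the STDP change by a scalar reward (e.g., sound saliency),
    analogous to the modulated SNN-to-vocal-simulator connections."""
    return float(np.clip(w + lr * reward * stdp_dw(dt), 0.0, 1.0))

# A causal pre->post pairing under positive reward strengthens the synapse;
# an anti-causal pairing weakens it.
w_up = reward_modulated_update(0.5, dt=5.0, reward=1.0)
w_down = reward_modulated_update(0.5, dt=-5.0, reward=1.0)
```

A zero reward leaves the weight unchanged under this rule, so only salient vocalizations reshape the motor mapping, which matches the reward-gating idea described above.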
