Abstract

When listening to music, humans can easily identify and move to the beat. Numerous experimental studies have identified brain regions that may be involved with beat perception and representation. Several theoretical and algorithmic approaches have been proposed to account for this ability. Related to, but different from, the issue of how we perceive a beat is the question of how we learn to generate and hold a beat. In this paper, we introduce a neuronal framework for a beat generator that is capable of learning isochronous rhythms over a range of frequencies that are relevant to music and speech. Our approach combines ideas from error-correction and entrainment models to investigate the dynamics of how a biophysically based neuronal network model synchronizes its period and phase to match that of an external stimulus. The model makes novel use of ongoing faster gamma rhythms to form a set of discrete clocks that provide estimates, but not exact information, of how well the beat generator spike times match those of a stimulus sequence. The beat generator is endowed with plasticity, allowing it to quickly learn and thereby adjust its spike times to achieve synchronization. Our model makes generalizable predictions about the existence of asymmetries in the synchronization process, as well as specific predictions about resynchronization times after changes in stimulus tempo or phase. Analysis of the model demonstrates that accurate rhythmic time keeping can be achieved over a range of frequencies relevant to music, in a manner that is robust to changes in parameters and to the presence of noise.
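
To make the counting mechanism described above concrete, here is a minimal numerical sketch, not the paper's model: it assumes a 40 Hz gamma clock, a beat generator (BG) that fires exactly once per stimulus beat, and simple linear correction rules with hand-picked gains, all of which are illustrative assumptions rather than the authors' equations.

```python
import numpy as np

# Minimal sketch of count-based beat synchronization (illustrative assumptions
# throughout: 40 Hz gamma clock, one BG spike per stimulus beat, linear
# correction rules with hand-picked gains; this is not the paper's model).

GAMMA_HZ = 40.0               # assumed gamma frequency
TICK = 1.0 / GAMMA_HZ         # resolution of the discrete gamma counters (s)

def ticks(duration):
    """Quantize an elapsed time to a whole number of gamma cycles."""
    return int(round(duration / TICK))

stim_period = 0.60                             # isochronous stimulus period (s)
stim_times = stim_period * np.arange(1, 21)    # 20 stimulus onsets

bg_period = 0.45      # BG starts too fast ...
bg_spike = 0.70       # ... and out of phase
d_period, d_phase = 0.5, 0.5                   # assumed correction gains
prev_stim = 0.0

for n, t_stim in enumerate(stim_times, start=1):
    # Counter-based (quantized) estimates available to the BG at this beat.
    ioi_count = ticks(t_stim - prev_stim)                  # stimulus inter-onset count
    period_error = (ioi_count - ticks(bg_period)) * TICK   # period mismatch estimate
    phase_error = ticks(bg_spike - t_stim) * TICK          # signed asynchrony estimate

    # Error-correction updates: adjust the period, then schedule the next spike.
    bg_period += d_period * period_error
    bg_spike += bg_period - d_phase * phase_error

    prev_stim = t_stim
    print(f"beat {n:2d}: BG period = {bg_period:.3f} s, asynchrony = {phase_error:+.3f} s")
```

Because the counters quantize time to whole gamma cycles, the learned period and residual asynchrony in this sketch settle only to within roughly one gamma cycle of their targets, which is in the spirit of the abstract's point that the counts provide estimates rather than exact information.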

Highlights

  • Humans have the ability to estimate and keep track of time over a variety of timescales in a host of different contexts, ranging from sub-seconds to tens of seconds or more [1, 2].

  • We start with a demonstration of how the beat generator neuron (BG) learns to synchronize to an isochronous stimulus sequence.

  • We describe how the BG learns a period by first utilizing a continuous-time version of the gamma counters to derive a one-dimensional map (a toy stand-in for such a map is sketched after this list).
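
The last highlight mentions a one-dimensional map for period learning, but the fragment does not give the map itself. As a stand-in, the toy linear map below (an assumption, not the map derived in the paper) shows how convergence of the learned period can be read off from a fixed point and the slope there.

```python
# Toy one-dimensional period map (a stand-in; the paper derives its own map
# from the continuous-time gamma counters, which is not reproduced here):
#     T_{n+1} = f(T_n) = T_n + delta * (T_s - T_n)
# The fixed point is T* = T_s, and it is stable whenever |f'(T*)| = |1 - delta| < 1,
# so iterates converge geometrically at rate |1 - delta|.

T_s = 0.60      # stimulus period (s)
delta = 0.5     # assumed correction gain (0 < delta < 2 gives convergence)
T = 0.45        # initial beat-generator period

for n in range(1, 11):
    T += delta * (T_s - T)          # one iteration of the map
    print(f"n={n:2d}  T_n={T:.4f} s  error={T - T_s:+.4f} s")
```

The slope at the fixed point also sets the geometric rate of convergence, which is the kind of quantity one would use to estimate resynchronization times after a tempo or phase change, as discussed in the abstract.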

Introduction

Humans have the ability to estimate and keep track of time over a variety of timescales in a host of different contexts, ranging from sub-seconds to tens of seconds or more [1, 2]. We also utilize forms of time estimation that can span hours, days, or years [5]. Many such examples involve the brain making a calculation over a single event, so-called “interval timing” [6, 7]. In contrast, we instinctively move to the beat of a piece of music through a form of sensorimotor synchronization, so-called beat-based timing [8, 9, 10, 11]. Doing so involves identifying an underlying beat within a piece of music and coordinating the frequency and timing of one’s movements to match this beat.
