The dynamics of neuronal systems are characterized by hallmark features such as oscillations and synchrony. However, it has remained unclear whether these characteristics are epiphenomena or are exploited for computation. Due to the challenge of selectively interfering with oscillatory network dynamics in neuronal systems, we simulated recurrent networks of damped harmonic oscillators in which oscillatory activity is enforced in each node, a choice well supported by experimental findings. When trained on standard pattern recognition tasks, these harmonic oscillator recurrent networks (HORNs) outperformed nonoscillatory architectures with respect to learning speed, noise tolerance, and parameter efficiency. HORNs also reproduced many characteristic features of neuronal systems, such as the cerebral cortex and the hippocampus. In trained HORNs, stimulus-induced interference patterns holistically represent the result of comparing sensory evidence with priors stored in recurrent connection weights, and learning-induced weight changes are compatible with Hebbian principles. Implementing additional features characteristic of natural networks, such as heterogeneous oscillation frequencies, inhomogeneous conduction delays, and network modularity, further enhanced HORN performance without requiring additional parameters. Taken together, our model allows us to give plausible a posteriori explanations for features of natural networks whose computational role has remained elusive. We conclude that neuronal systems are likely to exploit the unique dynamics of recurrent oscillator networks whose computational superiority critically depends on the oscillatory patterning of their nodal dynamics. Implementing the proposed computational principles in analog hardware is expected to enable the design of highly energy-efficient and self-adapting devices that could ideally complement existing digital technologies.
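To make the architecture described above concrete, the sketch below shows what a HORN-style recurrent layer might look like: each node is a damped harmonic oscillator, x'' = -γx' - ω²x + drive, coupled to the other nodes through a recurrent weight matrix. This is a minimal illustration under assumptions, not the authors' implementation; all names (omega, gamma, W_rec, W_in), the coupling scheme, and the parameter distributions are hypothetical choices for exposition.

```python
# Minimal, assumption-laden sketch of a HORN-style layer (not the paper's code).
# Each node integrates x'' = -gamma*x' - omega^2*x + drive with semi-implicit
# Euler; the drive combines a learned recurrent projection of the other nodes'
# positions with a learned projection of the external input.
import numpy as np

rng = np.random.default_rng(0)

n_nodes, n_inputs = 32, 8
dt = 0.01  # integration time step

# Heterogeneous natural frequencies, in the spirit of the abstract's claim that
# frequency heterogeneity helps; the uniform range here is an assumption.
omega = rng.uniform(5.0, 15.0, n_nodes)   # natural frequencies (rad/s)
gamma = np.full(n_nodes, 1.0)             # damping coefficients

W_rec = rng.normal(0.0, 0.1, (n_nodes, n_nodes))  # recurrent coupling (trainable)
W_in = rng.normal(0.0, 0.5, (n_nodes, n_inputs))  # input projection (trainable)

def step(x, v, u):
    """Advance all oscillators by one step dt given the input vector u."""
    drive = W_rec @ x + W_in @ u
    acc = -gamma * v - omega**2 * x + drive
    v = v + dt * acc   # semi-implicit (symplectic) Euler: update velocity first,
    x = x + dt * v     # then position, which keeps the oscillation stable
    return x, v

# Drive the network with a short random input sequence and inspect the state.
x = np.zeros(n_nodes)
v = np.zeros(n_nodes)
for t in range(200):
    u = rng.normal(0.0, 1.0, n_inputs)
    x, v = step(x, v, u)
print("final oscillator positions:", x[:5])
```

In such a setup, the trainable parameters would be W_rec and W_in (and possibly omega and gamma), and the stimulus-induced interference patterns the abstract refers to would correspond to the spatiotemporal pattern of the x values across nodes.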