Abstract

The neural network is a powerful computing framework that has been exploited by biological evolution and by humans to solve diverse problems. Although the computational capabilities of a neural network are determined by its structure, our current understanding of the relationship between a network's architecture and its function remains primitive. Here we reveal that modular architecture plays a vital role in determining the dynamics and memory performance of networks of threshold neurons. In particular, we demonstrate that there exists an optimal modularity for memory performance, at which a balance between local cohesion and global connectivity is established, allowing optimally modular networks to remember longer. Our results suggest that insights from the dynamical analysis of neural networks and from information-spreading processes can be leveraged to better design neural networks, and they may shed light on the brain's modular organization.

Highlights

  • Neural networks are the computing engines behind many living organisms

  • We demonstrate that echo state networks (ESNs) exhibit an optimal modularity for both signal spreading and memory capacity, and that this optimum is closely linked to the optimal modularity for information spreading (a minimal reservoir sketch follows this list)

  • Through dynamical analysis, we found that balancing local and global cohesion enables modular reservoirs to spread information across the network and to consolidate distributed signals; alternative mechanisms, such as cycle properties [56], may also be at play
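
To make the highlighted result concrete, below is a minimal, self-contained Python sketch of the kind of experiment the highlights describe: build a modular reservoir, drive it with a random input signal, and measure Jaeger's memory capacity from trained linear readouts. The `modular_reservoir` construction, the mixing parameter `mu`, and all parameter values are illustrative assumptions rather than the paper's exact setup, and tanh units stand in for the threshold neurons studied here.

```python
import numpy as np

rng = np.random.default_rng(0)

def modular_reservoir(n=200, modules=4, mu=0.8, density=0.1, rho=0.9):
    """Random reservoir whose links fall mostly within modules.

    mu mixes within- vs. between-module link probability: mu = 1/modules
    gives a uniform random network, while mu -> 1 gives nearly disconnected
    modules. This parameterization is illustrative, not the paper's exact one.
    """
    labels = np.repeat(np.arange(modules), n // modules)
    same = labels[:, None] == labels[None, :]
    p_in = density * mu * modules                          # within-module
    p_out = density * (1 - mu) * modules / (modules - 1)   # between-module
    mask = rng.random((n, n)) < np.where(same, p_in, p_out)
    W = np.where(mask, rng.standard_normal((n, n)), 0.0)
    W *= rho / np.abs(np.linalg.eigvals(W)).max()          # echo state scaling
    return W

def memory_capacity(W, max_delay=40, T=4000, washout=200):
    """Jaeger's memory capacity: sum over delays k of the squared correlation
    between u(t - k) and its best linear reconstruction from the state x(t)."""
    n = W.shape[0]
    w_in = rng.uniform(-0.5, 0.5, n)
    u = rng.uniform(-1, 1, T)
    x, X = np.zeros(n), np.empty((T, n))
    for t in range(T):
        x = np.tanh(W @ x + w_in * u[t])   # tanh stands in for threshold units
        X[t] = x
    X, u = X[washout:], u[washout:]
    mc = 0.0
    for k in range(1, max_delay + 1):
        states, target = X[k:], u[:-k]                     # align x(t), u(t-k)
        w_out, *_ = np.linalg.lstsq(states, target, rcond=None)
        mc += np.corrcoef(states @ w_out, target)[0, 1] ** 2
    return mc

for mu in (0.25, 0.5, 0.75, 0.95):
    print(f"mu = {mu:.2f}  MC = {memory_capacity(modular_reservoir(mu=mu)):.2f}")
```

In a sweep like this, memory capacity would be expected to peak at an intermediate value of `mu`, which is the qualitative signature of the optimal-modularity effect described above.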


Introduction

Neural networks are the computing engines behind many living organisms, and they are also prominent general-purpose frameworks for machine learning and artificial intelligence applications [1]. The behavior of a neural network is determined by the dynamics of individual neurons, the topology and strength of individual connections, and the large-scale architecture. In both biological and artificial neural networks, neurons integrate input signals and produce a graded or threshold-like response. In machine learning, feed-forward convolutional architectures have achieved super-human visual recognition capabilities [1, 4], while recurrent architectures exhibit impressive natural-language processing and control capabilities [5].

