Abstract

Attempting to imitate the brain’s functionalities, researchers have bridged neuroscience and artificial intelligence for decades; however, experimental neuroscience has not directly advanced the field of machine learning (ML). Here, using neuronal cultures, we demonstrate that increased training frequency accelerates neuronal adaptation. This mechanism was implemented in artificial neural networks, where a local learning step size increases for coherent consecutive learning steps, and was tested on a simple dataset of handwritten digits, MNIST. In on-line learning with only a few handwriting examples, the success rates of the brain-inspired algorithm substantially exceed those of commonly used ML algorithms. We speculate that this emerging bridge from slow brain function to ML will promote ultrafast decision making under limited examples, which is the reality in many aspects of human activity, robotic control, and network optimization.
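
A minimal NumPy sketch of one plausible reading of this rule is given below: each weight keeps its own local step size, which grows when consecutive per-weight gradient components share a sign ("coherent" consecutive steps) and shrinks otherwise, similar in spirit to classical sign-based adaptive schemes such as delta-bar-delta or Rprop. The class name, the growth/shrink factors, and the single-layer softmax classifier are our own illustrative choices, not the paper's implementation.

    import numpy as np

    # Illustrative constants -- not the values used in the paper.
    ETA0 = 0.01      # initial per-weight step size
    ETA_UP = 1.2     # growth factor for coherent consecutive steps
    ETA_DOWN = 0.5   # shrink factor when consecutive steps disagree

    def softmax(z):
        z = z - z.max()
        e = np.exp(z)
        return e / e.sum()

    class CoherentStepClassifier:
        """Single-layer softmax classifier whose per-weight step size grows
        whenever consecutive gradient components keep the same sign."""

        def __init__(self, n_inputs, n_classes):
            self.W = np.zeros((n_classes, n_inputs))
            self.eta = np.full_like(self.W, ETA0)    # local step size per weight
            self.prev_grad = np.zeros_like(self.W)   # gradient of the previous step

        def update(self, x, label):
            """One on-line (per-example) learning step."""
            p = softmax(self.W @ x)
            p[label] -= 1.0                   # dLoss/dlogits for cross-entropy
            grad = np.outer(p, x)             # dLoss/dW
            coherent = grad * self.prev_grad > 0
            self.eta[coherent] *= ETA_UP      # accelerate coherent directions
            self.eta[~coherent] *= ETA_DOWN   # damp incoherent ones
            self.eta = np.clip(self.eta, 1e-5, 1.0)
            self.W -= self.eta * grad
            self.prev_grad = grad

        def predict(self, x):
            return int(np.argmax(self.W @ x))

Feeding such a classifier a handful of 784-pixel digit vectors one at a time (a single pass, no batching) mimics the few-example on-line setting referred to above; whether this particular rule reproduces the reported success rates is not something the sketch claims.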

Highlights

  • Attempting to imitate the brain’s functionalities, researchers have bridged neuroscience and artificial intelligence for decades; however, experimental neuroscience has not directly advanced the field of machine learning (ML)

  • Machine learning is based on Donald Hebb’s pioneering work; seventy years ago, he suggested that learning occurs in the brain through synaptic strength modifications[1]

  • Although the brain is comparatively slow, its computational capabilities outperform typical state-of-the-art artificial intelligence algorithms. Following this speed/capability paradox, we experimentally derive accelerated learning mechanisms based on small datasets, whose utilization on gigahertz processors is expected to lead to ultrafast decision making


Introduction

Attempting to imitate the brain’s functionalities, researchers have bridged neuroscience and artificial intelligence for decades; however, experimental neuroscience has not directly advanced the field of machine learning (ML). Although the brain is comparatively slow, its computational capabilities outperform typical state-of-the-art artificial intelligence algorithms. Following this speed/capability paradox, we experimentally derive accelerated learning mechanisms based on small datasets, whose utilization on gigahertz processors is expected to lead to ultrafast decision making. A new type of adaptive rule has been observed experimentally, based on the timing of signal arrival at the dendrites[7]; it is similar to the slow adaptation mechanism currently attributed to the synapses (links), but operates on a faster timescale: it requires approximately five minutes, whereas synaptic modification requires tens of minutes or more.
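
The separation of timescales can be pictured with a simple first-order relaxation toy model. The sketch below is purely illustrative: the time constants, targets, and relaxation dynamics are our own assumptions rather than the measured dendritic or synaptic dynamics; it only shows how a node-level quantity with a roughly five-minute constant settles long before a link-level quantity with a tens-of-minutes constant.

    # Each quantity relaxes exponentially toward its target; the time constants
    # (in arbitrary "minutes") mirror the fast-node vs slow-link separation only
    # qualitatively.
    TAU_NODE = 5.0    # dendritic (node-level) adaptation, ~minutes
    TAU_LINK = 30.0   # synaptic (link-level) adaptation, tens of minutes
    DT = 0.1          # integration step, "minutes"

    g, g_target = 1.0, 2.0   # node-level gain and its (assumed) target
    w, w_target = 0.5, 1.5   # a representative link weight and its target

    t = 0.0
    while abs(g - g_target) > 0.05 * abs(g_target - 1.0):
        g += DT / TAU_NODE * (g_target - g)
        w += DT / TAU_LINK * (w_target - w)
        t += DT

    print(f"node gain within 5% of its target after ~{t:.1f} min; "
          f"link weight has covered only {100 * (w - 0.5) / (1.5 - 0.5):.0f}% of its change")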

