Abstract

The minimization of a free energy is often regarded as a key principle for understanding how the brain works and how brain structure forms. In particular, a statistical-mechanics-based neural network model is expected to allow one to interpret many aspects of neural firing and learning in terms of general concepts and mechanisms from statistical physics. Nevertheless, defining the free energy of a neural system is usually an intricate problem with no evident solution. After the pioneering work by Hopfield, several statistical-mechanics-based models have suggested a variety of definitions of the free energy or the entropy of a neural system. Among these, the recently proposed Feynman machine defines the free energy of a neural system via the Feynman path integral formulation, with an explicit time variable. In this study, we first give a brief review of the relevant previous models, paying attention to their problematic points, and examine how the Feynman machine overcomes several of these vulnerabilities and derives the firing and learning rules of a (biological) neural system as extremum states of the free energy. Specifically, the model reveals that the biological learning mechanism called spike-timing-dependent plasticity is related to the free-energy minimization principle. Computing and learning in the Feynman machine are based on the exact spike timings of neurons, as in a biological neural system. We discuss the consequences of adopting an explicit time variable in modeling a neural system, and the application of the free-energy minimization principle to understanding phenomena in the brain.
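The spike-timing-dependent plasticity mentioned above is conventionally modeled with exponential temporal windows. The following is a minimal sketch of that standard textbook form, not the specific rule derived in the Feynman machine; the amplitudes and time constant are illustrative values:

```python
import math

def stdp_dw(dt, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Weight change for a spike-time difference dt = t_post - t_pre (ms).

    Pre-before-post (dt > 0) potentiates the synapse; post-before-pre
    (dt < 0) depresses it. a_plus, a_minus, and tau are illustrative.
    """
    if dt > 0:
        return a_plus * math.exp(-dt / tau)   # potentiation branch
    elif dt < 0:
        return -a_minus * math.exp(dt / tau)  # depression branch
    return 0.0
```

The sign of the weight change depends only on the order of the pre- and postsynaptic spikes, and its magnitude decays exponentially with their separation, which is the millisecond-scale timing sensitivity the abstract refers to.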

Highlights

  • A neural network is a system tasked with computing, memorizing, and thinking. The history of neural network modeling unfolded with the work of McCulloch and Pitts [1]. They introduced the notion of a formal neuron as a two-state threshold element and showed how a network of such elements can perform logical calculations

  • It is crucial in theoretical neuroscience to develop an abstract neuron model without losing the essential features of biological neurons for information processing or learning

  • A neuron model explaining the firing activity and the learning rule through the concepts and principles of statistical mechanics could help reveal the essence of neural computing and learning mechanisms


Introduction

A neural network is a system tasked with computing, memorizing, and thinking. Some neuron models describe the stochastic firing dynamics through differential equations with stochastic terms, i.e., Langevin dynamic equations; other models do so through Markov-chain Monte Carlo processes. These models describe the firing rule and/or the learning rule via statistical mechanics, which aims to explain the measurable properties of macroscopic systems on the basis of the behavior of their microscopic constituents. A statistical-mechanics-based model usually formulates the firing and/or learning rules as Markov-chain Monte Carlo processes on an energy function or as gradient descents in a free energy. The Feynman machine instead utilizes the exact timings of neural firings as the essential quantity in implementing its computing and learning mechanisms, much as biological neurons perform computation and communication through their sensitivity to spike timings on the millisecond time scale [14,15]
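The Markov-chain Monte Carlo firing rule described above can be illustrated with Glauber dynamics on a Hopfield-type energy E = -(1/2) Σᵢⱼ wᵢⱼ sᵢ sⱼ. This is a generic sketch of the statistical-mechanics approach the review surveys, not the Feynman machine itself; the pattern, weights, and inverse temperature β are illustrative:

```python
import math
import random

def energy(w, s):
    """Hopfield-type energy E = -1/2 * sum_ij w[i][j] * s[i] * s[j]."""
    n = len(s)
    return -0.5 * sum(w[i][j] * s[i] * s[j]
                      for i in range(n) for j in range(n))

def glauber_step(w, s, beta=1.0, rng=random):
    """One Monte Carlo step: resample a random unit from its Boltzmann factor."""
    i = rng.randrange(len(s))
    # Local field h_i = sum_j w_ij s_j determines P(s_i = +1).
    h = sum(w[i][j] * s[j] for j in range(len(s)))
    p_up = 1.0 / (1.0 + math.exp(-2.0 * beta * h))
    s[i] = 1 if rng.random() < p_up else -1
    return s
```

Iterating `glauber_step` samples network states with probability proportional to exp(-βE), so at large β the chain concentrates on the low-energy states that encode stored patterns; this is the sense in which the firing rule is a Monte Carlo process on an energy function.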

Prologue
Hopfield Network
Boltzmann Machine
Informatix Rule
Pseudo-Stochastic Learning Model
Feynman Machine
Learning Principle in the Feynman Machine
Discussion