Abstract

Recent probabilistic interpretations of neural network models have suggested the formulation of network operations in information-theoretic terms. In these interpretations, the neural network develops an assumed probability density function that represents its assumptions about the environment. Under a set of hypotheses, this probability density function is shown to maintain an exponential relationship with an energy-like function that the network tends to minimize. The purpose of this note is to obtain this probability density function through Shannon's derivation of the entropy measure and Jaynes' maximum entropy principle. The main conclusion is that the neural network assumes the worst-case (i.e., most uncertain, or maximum-entropy) probability density function for the unknown environment.
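The exponential relationship between the density and the energy-like function can be sketched by a standard maximum-entropy argument. The following derivation is illustrative only; the symbols $p$, $E$, $\beta$, and $Z$ are assumed notation, not taken from the paper. Maximizing the entropy subject to normalization and a fixed expected energy yields, via Lagrange multipliers, a density of Gibbs (exponential) form:

```latex
% Illustrative maximum-entropy derivation (assumed notation):
% maximize H[p] = -\int p \ln p subject to \int p = 1 and \int p E = \langle E \rangle.
\mathcal{L}[p] = -\int p(x)\ln p(x)\,dx
  + \lambda_0\!\left(\int p(x)\,dx - 1\right)
  - \beta\!\left(\int p(x)\,E(x)\,dx - \langle E\rangle\right)

% Stationarity, \delta\mathcal{L}/\delta p = 0, gives
-\ln p(x) - 1 + \lambda_0 - \beta E(x) = 0
\quad\Longrightarrow\quad
p(x) = \frac{e^{-\beta E(x)}}{Z},
\qquad Z = \int e^{-\beta E(x)}\,dx .
```

On this reading, minimizing the energy-like function corresponds to raising the probability the network assigns to a state, which is consistent with the exponential relationship stated above.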
