Abstract

Information theory is used to analyze the effects of finite resolution and nonlinearities in multi-layered networks. The authors formulate the effect that storing continuous quantities in binary registers has on the information content of a neural processing element's output. The analysis reveals that the effect of quantization on information in a neural processing element is a function of the information content of the input, the node nonlinearity, and the length of the binary register holding the output. By casting traditional types of neural processing in statistical form, two classes of information processing in neural networks are identified, each with widely different resolution requirements. Information theory is thus shown to provide a means of formalizing this taxonomy of neural network processing and a method for linking the highly abstract processing performed by a neural network with the constraints of its implementation.
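The dependence of output information on register length, input distribution, and node nonlinearity can be illustrated with a minimal sketch. The setup below is an assumption for illustration only (Gaussian inputs, a sigmoid nonlinearity, and uniform quantization to a b-bit register), not the paper's formulation: it empirically estimates the Shannon entropy of the quantized node output for several register lengths.

```python
import math
import random

def entropy_bits(counts):
    """Shannon entropy (in bits) of an empirical distribution given by counts."""
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def quantize(x, bits):
    """Store x in [0, 1) as an unsigned integer in a register of `bits` bits."""
    levels = 1 << bits
    return min(int(x * levels), levels - 1)

random.seed(0)
# Hypothetical node: Gaussian inputs passed through a sigmoid nonlinearity.
outputs = [1.0 / (1.0 + math.exp(-random.gauss(0.0, 2.0)))
           for _ in range(100_000)]

h_by_bits = {}
for bits in (2, 4, 8):
    counts = {}
    for x in outputs:
        q = quantize(x, bits)
        counts[q] = counts.get(q, 0) + 1
    h_by_bits[bits] = entropy_bits(counts)
    print(f"{bits}-bit register: output entropy ~ {h_by_bits[bits]:.3f} bits")
```

The entropy of the quantized output is bounded above by the register length and, for a fixed input distribution and nonlinearity, saturates as the register grows, which is one way to see why different classes of processing can have very different resolution requirements.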
