Abstract

We extend our earlier work on positive reciprocal kernels of Fredholm integral operators [9] to study firings and their synthesis in neural networks. First we show that, in general, neural responses in both spontaneous and non-spontaneous firing give rise to generalized functions of the Dirac type. For spontaneous, unstimulated firing we solve a homogeneous eigenvalue equation and obtain a family of gamma-type functions. Finite linear combinations of these functions are dense in Sobolev spaces, and the solution of the inhomogeneous equation representing non-spontaneous firing belongs to these spaces. Next we show that, according to known facts about neural networks, the forcing function of the inhomogeneous equation is a linear combination of the above functions and can be used to represent the synthesis of stimuli within a neuron causing it to fire. We also show that the solution of the inhomogeneous equation can be expressed as a linear combination of the basic functions describing the neural response to those stimuli. The need for a firing threshold, characteristic of the Dirac distribution, emerges as a necessary condition for the existence of a solution. Second, we study the synthesis of the response of several neurons in both hierarchic and feedback network arrangements. The analysis is then briefly generalized to examine the response to several stimuli and represent it as a direct sum of topological spaces. One observation is that generalized functions are appropriate representations of neural firings. Another is that understanding the structure of this representation is facilitated by the inevitable use of a fundamental set of dense functions to deal with the operations of a very complex system.
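For orientation, a minimal sketch of the two equations the abstract refers to, written in the standard Fredholm form; the kernel K(s,t), the interval [a,b], and the explicit gamma-type parametrization below are assumptions for illustration, not the paper's own notation:

    % homogeneous eigenvalue equation (spontaneous firing)
    w(s) = \lambda \int_a^b K(s,t)\, w(t)\, dt
    % inhomogeneous equation (non-spontaneous firing), with forcing function f(s)
    w(s) = \lambda \int_a^b K(s,t)\, w(t)\, dt + f(s)

Here f(s) represents the synthesis of stimuli driving the neuron, and a "gamma-type" eigenfunction would be of the general form g(t) \propto t^{\alpha} e^{-\beta t} with \alpha, \beta > 0, finite linear combinations of which form the dense family mentioned in the abstract.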
