Abstract

Traditional feedforward neural networks are static structures that simply map input to output. To better reflect the dynamics of biological systems, time dependency is incorporated into the network by using Finite Impulse Response (FIR) linear filters to model the processes of axonal transport, synaptic modulation, and charge dissipation. While a constructive proof establishes a theoretical equivalence between the class of problems solvable by the FIR model and by the static structure, the FIR model retains certain practical and computational advantages. Adaptation of the network is achieved through an efficient gradient descent algorithm, which is shown to be a temporal generalization of the popular backpropagation algorithm for static networks. Applications of the network are discussed, with a detailed example of using the network for time series prediction.
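To make the core idea concrete, the following is a minimal sketch (not code from the paper) of an FIR synapse: each connection applies a finite impulse response filter to a tap-delay line of the input's recent history, rather than a single static weight. All names and the choice of sigmoid activation are illustrative assumptions.

```python
import math

def fir_synapse(x_history, taps):
    # FIR filter: weighted sum over the input's recent history.
    # x_history[k] is the input k time steps ago (x_history[0] is current).
    return sum(w * x for w, x in zip(taps, x_history))

def fir_neuron(x_series, taps):
    # Run a single-input FIR neuron over a time series. The output at
    # time t is a sigmoid of the filtered input history, with the history
    # zero-padded before the start of the series.
    T = len(taps)
    outputs = []
    for t in range(len(x_series)):
        history = [x_series[t - k] if t - k >= 0 else 0.0 for k in range(T)]
        s = fir_synapse(history, taps)
        outputs.append(1.0 / (1.0 + math.exp(-s)))  # sigmoid activation
    return outputs

# With taps = [w, 0, 0, ...] the filter collapses to a single static
# weight w, recovering the ordinary feedforward neuron as a special case.
y = fir_neuron([0.0, 1.0, 2.0], [1.0, 0.0, 0.0])
```

This degenerate-taps case illustrates the equivalence mentioned above: a static network is an FIR network whose filters have only one nonzero tap.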
