Abstract

In this paper, two methods, named Backward Computation (BC) and Forward Computation (FC), are derived for both on-line and batch gradient computation of a system output (for sensitivity analysis) or of a cost function (for learning) with respect to the system parameters, using Signal-Flow-Graph (SFG) representation theory and its known properties. The system can be any causal, in general non-linear and time-variant, dynamic system represented by an SFG, in particular any feedforward, time-delay, or recurrent neural network. In this work we use discrete-time notation, but the same theory holds for the continuous-time case. The gradient is obtained in a straightforward way by the analysis of two SFGs: the original one and its adjoint (for the BC method) or its derivative (for the FC method), both obtained from the original by simple transformations, without the complex chain-rule expansions of derivatives usually employed. The BC and FC methods are dual, and the adjoint and derivative SFGs (of the same SFG) can be obtained from one another by graph transformations. The BC method is local in space but not in time, while the FC method is local in time but not in space.
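To make the stated duality concrete, the sketch below (a minimal illustration, not the paper's SFG formalism) computes the gradient of a quadratic cost with respect to the single parameter w of a scalar recurrent system x[t+1] = tanh(w*x[t] + u[t]), once by an adjoint (backward) pass in the spirit of BC and once by a forward sensitivity pass in the spirit of FC. The system, the cost, and all function names are illustrative assumptions, not taken from the paper.

```python
import math

# Scalar recurrent system: x[t+1] = tanh(w * x[t] + u[t])
# Cost: J = 0.5 * sum_t (x[t+1] - d[t])^2
# Both routines return dJ/dw; names BC/FC here are only analogies.

def forward_trajectory(w, x0, u):
    xs = [x0]
    for ut in u:
        xs.append(math.tanh(w * xs[-1] + ut))
    return xs

def grad_backward(w, x0, u, d):
    """Adjoint (backward) pass, BC-like: local in space, but requires
    the whole stored trajectory, i.e. it is not local in time."""
    xs = forward_trajectory(w, x0, u)
    grad, lam = 0.0, 0.0           # lam = adjoint of the state
    for t in reversed(range(len(u))):
        dtanh = 1.0 - xs[t + 1] ** 2   # derivative of tanh at step t
        lam += xs[t + 1] - d[t]        # inject local cost derivative
        grad += lam * dtanh * xs[t]    # direct dependence on w
        lam = lam * dtanh * w          # propagate adjoint one step back
    return grad

def grad_forward(w, x0, u, d):
    """Sensitivity (forward) pass, FC-like: runs on-line, local in
    time, but carries the full state sensitivity s = dx/dw, i.e. it
    is not local in space."""
    x, s, grad = x0, 0.0, 0.0
    for ut, dt in zip(u, d):
        x_new = math.tanh(w * x + ut)
        dtanh = 1.0 - x_new ** 2
        s = dtanh * (x + w * s)        # chain rule through one step
        x = x_new
        grad += (x - dt) * s           # accumulate gradient on-line
    return grad

w, x0 = 0.7, 0.1
u = [0.3, -0.2, 0.5, 0.1]
d = [0.2, 0.1, 0.4, 0.3]
print(grad_backward(w, x0, u, d))  # the two values agree: same dJ/dw
print(grad_forward(w, x0, u, d))
```

Note how the backward routine must first store the entire trajectory before sweeping the adjoint backward, whereas the forward routine processes each time step as it arrives but updates the sensitivity of every state variable, which mirrors the space/time locality trade-off described above.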
