Abstract
We propose a back-propagation neural network with built-in time-delay elements (BPD), in which each neuron's output is fed back to itself through a delay element. The learning algorithm for the BPD is derived by the steepest-descent method. The resulting processing methods are classified into four types, according to the degree of simplification applied during formulation and whether differential values are obtained numerically by perturbation. As application problems, four types of problems are constructed from the combinations of analog and digital input and output signals of the neural network. For these four problems, the BPD is simulated on a computer using each of the four processing methods, and the preferable processing method is examined with respect to learning results and processing time. It is confirmed that the BPD achieves a sufficient learning effect with a method in which a secondary effect is ignored and the formulation is simplified. Furthermore, the SCNN and Jordan's and Elman's networks are taken as examples of conventional recurrent neural networks that can handle time-sequence problems; their results on the four application problems are compared with those of the BPD to confirm the effectiveness of the proposed network.
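The abstract does not give the paper's exact equations, but the architecture it describes can be sketched as follows: each neuron receives, in addition to its weighted inputs, its own previous output through a delay element, and training uses gradient descent in which the "secondary effect" (the dependence of the delayed output on the weights) is ignored, i.e., a truncated gradient. All names and the concrete formulation below are assumptions for illustration, not the authors' method.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class BPDNeuron:
    """A single neuron with a self-feedback delay element (hypothetical form):
        y(t) = f( sum_i w_i * x_i(t) + d * y(t-1) + b )
    where d weights the neuron's own delayed output y(t-1)."""

    def __init__(self, n_inputs, lr=0.5):
        self.w = [0.1] * n_inputs  # input weights (arbitrary init)
        self.d = 0.1               # weight on the delayed self-feedback
        self.b = 0.0               # bias
        self.lr = lr               # learning rate
        self.y_prev = 0.0          # delayed output y(t-1)

    def forward(self, x):
        net = sum(wi * xi for wi, xi in zip(self.w, x)) \
              + self.d * self.y_prev + self.b
        y = sigmoid(net)
        self._y_delayed = self.y_prev  # remember the delayed value used here
        self.y_prev = y                # becomes y(t-1) for the next step
        return y

    def train_step(self, x, target):
        # Simplified gradient descent: the dependence of y(t-1) on the
        # weights (the "secondary effect") is ignored, so y(t-1) is
        # treated as a constant input when differentiating.
        y = self.forward(x)
        err = y - target
        grad = err * y * (1.0 - y)  # dE/dnet for squared error + sigmoid
        for i, xi in enumerate(x):
            self.w[i] -= self.lr * grad * xi
        self.d -= self.lr * grad * self._y_delayed
        self.b -= self.lr * grad
        return 0.5 * err * err
```

A brief usage example: repeatedly presenting a short two-step sequence and checking that the squared error shrinks illustrates the simplified learning rule converging despite the ignored recurrent term.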