Abstract

This paper presents several results regarding the application of the NARX model and the recurrent neural network (RNN) model to system identification and control. We show that every RNN can be transformed into a first-order NARX model, and vice versa, under the condition that the neuron transfer function is similar to the NARX transfer function. If the neuron transfer function is piecewise linear, that is, f(x) := x if |x| < 1 and f(x) := sign(x) otherwise, we further show that every NARX model of order larger than one can be transformed into an RNN. These equivalence results offer three practical advantages: (i) if the output dimension of a NARX model is larger than the number of its hidden units, training is faster via the equivalent RNN, i.e. the equivalent RNN is trained instead of the NARX model and, once training is finished, transformed back into an equivalent NARX model; (ii) conversely, if the output dimension of an RNN is smaller than the number of its hidden units, its training can be sped up in the same way; (iii) RNN pruning can be accomplished in a much simpler way, i.e. the equivalent NARX model is pruned instead of the RNN and, after pruning, transformed back into the equivalent RNN.
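
As an illustration of the kind of correspondence described above, the following is a minimal numerical sketch in NumPy. It is not the construction from the paper: the weight matrices A, B, C, the dimensions, and the seeding of the hidden state are illustrative assumptions. The sketch shows one simple direction of such an equivalence: a first-order NARX model y_t = C f(A y_{t-1} + B u_t) with the piecewise-linear transfer function above reproduces exactly the outputs of an RNN with recurrent weight W = A C, input weight U = B, and readout C.

```python
import numpy as np

def satlin(x):
    """Piecewise-linear transfer function from the abstract:
    f(x) = x if |x| < 1, and f(x) = sign(x) otherwise."""
    return np.clip(x, -1.0, 1.0)

rng = np.random.default_rng(0)
n_hidden, n_out, n_in, T = 5, 2, 3, 20               # illustrative sizes (assumed)

A = rng.normal(scale=0.3, size=(n_hidden, n_out))    # output-feedback weights (assumed)
B = rng.normal(scale=0.3, size=(n_hidden, n_in))     # input weights (assumed)
C = rng.normal(scale=0.3, size=(n_out, n_hidden))    # readout weights (assumed)

u = rng.normal(size=(T, n_in))   # input sequence
y0 = np.zeros(n_out)             # initial output

# First-order NARX recursion: y_t depends only on y_{t-1} and u_t.
y_narx = [y0]
for t in range(T):
    h = satlin(A @ y_narx[-1] + B @ u[t])
    y_narx.append(C @ h)

# RNN built from the same weights: x_t = f(W x_{t-1} + U u_t), y_t = C x_t,
# with W = A C and U = B, seeded with the first NARX hidden state.
W, U = A @ C, B
x = satlin(A @ y0 + B @ u[0])
y_rnn = [y0, C @ x]
for t in range(1, T):
    x = satlin(W @ x + U @ u[t])
    y_rnn.append(C @ x)

print(np.allclose(y_narx, y_rnn))  # True: both recursions give identical output sequences
```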
