Abstract

In this paper, a dedicated recurrent neural network design and a model reduction approach are proposed to improve the trade-off between complexity and quality of black-box nonlinear system identification models. The proposed neural network design, based on a three-layer architecture, reduces the number of model parameters after the training phase without any significant loss of estimation accuracy. Nevertheless, the proposed architecture remains sufficiently general to cover a wide range of models commonly encountered in the literature. The reduction, achieved by a convenient choice of the activation functions and of the initial values of the synaptic weights, proceeds in two steps. First, the proposed architecture is trained under two reasonable assumptions. Then, the recurrent three-layer neural network is transformed into a two-layer representation with fewer neurons, that is, a significantly reduced number of parameters. The constructed architecture yields models with a substantially reduced number of parameters and convenient estimation accuracy. To validate the proposed approach, we identify the Wiener–Hammerstein benchmark nonlinear system proposed in SYSID2009 [1].
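As a purely illustrative sketch (the abstract does not detail the exact reduction procedure, so the layer sizes, activation choice, and merging rule below are assumptions), one way a three-layer network collapses into an equivalent two-layer one is when one hidden layer is linear: two consecutive linear maps can then be merged into a single weight matrix, cutting the parameter count without changing the model's input-output behavior.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes for illustration only.
n_in, n_h1, n_h2, n_out = 4, 8, 6, 1
W1 = rng.standard_normal((n_h1, n_in))   # first hidden layer (nonlinear, tanh)
W2 = rng.standard_normal((n_h2, n_h1))   # second hidden layer (assumed linear)
W3 = rng.standard_normal((n_out, n_h2))  # linear output layer

def three_layer(x):
    # Original three-layer forward pass: tanh, then two linear maps.
    return W3 @ (W2 @ np.tanh(W1 @ x))

# Because W2 and W3 are both linear, they merge into one matrix.
W23 = W3 @ W2

def two_layer(x):
    # Equivalent two-layer forward pass with the merged matrix.
    return W23 @ np.tanh(W1 @ x)

x = rng.standard_normal(n_in)
print(np.allclose(three_layer(x), two_layer(x)))  # identical outputs

# Parameter count shrinks: 32 + 48 + 6 = 86 before, 32 + 8 = 40 after.
print(W1.size + W2.size + W3.size, W1.size + W23.size)
```

The same idea extends to recurrent networks, where the merged matrix also absorbs the corresponding feedback connections; the paper's contribution lies in choosing activations and initial weights so that this collapse is possible after training.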
