Abstract

This paper develops the area of nonlinear modelling based on the nonlinear function approximation capabilities of the multi-layer perceptron network. The mechanism of this nonlinear modelling technique is explained in a novel yet straightforward manner. Important aspects such as model order, over-parameterisation and the choice of excitation signals are discussed. To provide familiar and useful model structures, techniques are presented through which a trained neural model can be expanded into Volterra or polynomial NARX expressions. Likewise, it is shown that linearisation of the network at each sample can generate the parameters for an adaptive linear plant model. To accelerate the training of the multi-layer perceptron, both full-memory and memoryless versions of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) training algorithm are developed. A novel network pruning algorithm is provided that utilises the Hessian matrix generated by the full-memory BFGS algorithm. A parallel version of the memoryless BFGS algorithm is developed and mapped efficiently onto a pipeline of INMOS transputers. The performance of this parallel algorithm is demonstrated by training an inferential model for a realistic continuous stirred-tank reactor system. Finally, these neural modelling techniques are applied to provide an accurate predictive model of viscosity for an industrial polymerisation reactor.
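To make the modelling approach summarised above concrete, the following Python sketch illustrates two of its core steps: a one-hidden-layer MLP used as a NARX model is trained with a full-memory BFGS optimiser (here SciPy's general-purpose BFGS routine, standing in for the paper's algorithm), and the trained network is then linearised at a sample to recover local ARX parameters, mirroring the adaptive linear plant model described in the abstract. This is not the authors' code; the model orders, hidden-layer size and toy plant are illustrative assumptions.

```python
# A minimal sketch, not the authors' implementation: MLP NARX model trained
# with full-memory BFGS, then linearised at one sample. The plant, lag
# orders (ny, nu) and hidden-layer size (nh) are assumptions for illustration.
import numpy as np
from scipy.optimize import minimize

ny, nu, nh = 2, 2, 6          # output lags, input lags, hidden neurons
rng = np.random.default_rng(0)

# Toy nonlinear plant excited by a random binary (PRBS-like) input signal.
N = 400
u = rng.choice([-1.0, 1.0], size=N)
y = np.zeros(N)
for k in range(2, N):
    y[k] = 0.6*y[k-1] - 0.1*y[k-2] + 0.5*np.tanh(u[k-1]) + 0.1*u[k-2]

# NARX regressor x(k) = [y(k-1..ny), u(k-1..nu)], one-step-ahead target y(k).
X = np.column_stack([y[2-i:N-i] for i in range(1, ny+1)] +
                    [u[2-i:N-i] for i in range(1, nu+1)])
t = y[2:]

nx = ny + nu
def unpack(w):
    W1 = w[:nh*nx].reshape(nh, nx)      # input-to-hidden weights
    b1 = w[nh*nx:nh*nx+nh]              # hidden biases
    W2 = w[nh*nx+nh:nh*nx+2*nh]         # hidden-to-output weights
    b2 = w[-1]                          # output bias
    return W1, b1, W2, b2

def predict(w, X):
    W1, b1, W2, b2 = unpack(w)
    return np.tanh(X @ W1.T + b1) @ W2 + b2

def sse(w):                             # sum-of-squared-errors cost
    e = predict(w, X) - t
    return 0.5 * e @ e

w0 = 0.1 * rng.standard_normal(nh*nx + 2*nh + 1)
res = minimize(sse, w0, method="BFGS")  # full-memory BFGS training
print("training SSE:", res.fun)

# Linearise the trained network at sample k: the gradient of the one-step
# prediction w.r.t. the regressor gives local ARX coefficients.
W1, b1, W2, b2 = unpack(res.x)
xk = X[100]
grad = ((1 - np.tanh(W1 @ xk + b1)**2) * W2) @ W1   # d y_hat / d x at x(k)
print("local a (output-lag) parameters:", grad[:ny])
print("local b (input-lag) parameters:", grad[ny:])
```

Repeating the final linearisation at every sample would yield time-varying ARX parameters of the kind the abstract proposes for adaptive linear plant modelling; the paper's memoryless and parallel BFGS variants would replace the library optimiser used here.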
