Abstract

This paper investigates a new methodology and structure for neural networks (NNs) to enhance nonlinear multi-input multi-output (MIMO) signal processing. The methodology is based on a Legendre series expansion of the input pattern vectors. The proposed structure employs a flat single layer of neurons with linear transfer functions, eliminating the hidden layers, the sigmoidal nonlinear transfer functions, and the back-propagation commonly employed in conventional NNs. The orthogonality of the Legendre series improves the convergence properties of the proposed Legendre neural network (LNN), while the nonlinearity of the Legendre series plays the role of the sigmoidal transfer functions in a conventional NN. The linear transfer functions give the proposed LNN the great advantage of yielding solid, explicit formulae relating the input and target pattern vectors for any MIMO system in any field. A fast and uniform multi-input/multi-output LMS-Newton-type adaptive algorithm is explored for training the proposed LNN in an incremental mode. The application and improved performance of the proposed LNN in modelling/simulation are illustrated through simulation experiments.
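The architecture described above can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's exact algorithm: each input component is expanded in Legendre polynomials, a single flat layer of linear neurons maps the expanded features to the outputs, and the layer is trained incrementally with a plain LMS rule (the paper's LMS-Newton variant would replace the scalar step size with an inverse-correlation-matrix update). The expansion order, step size `mu`, and the toy MIMO target system are illustrative assumptions.

```python
import numpy as np
from numpy.polynomial import legendre

def legendre_expand(x, order):
    """Expand each component of x (assumed scaled to [-1, 1]) in Legendre
    polynomials P_0..P_order and concatenate into one feature vector."""
    # legvander returns one row [P_0(xi), ..., P_order(xi)] per component
    return legendre.legvander(np.asarray(x), order).ravel()

def train_lnn_lms(X, Y, order, mu=0.05, epochs=200):
    """Incremental (sample-by-sample) LMS training of the flat linear layer."""
    n_feat = X.shape[1] * (order + 1)
    W = np.zeros((Y.shape[1], n_feat))
    for _ in range(epochs):
        for x, y in zip(X, Y):
            phi = legendre_expand(x, order)
            e = y - W @ phi              # output error of the linear layer
            W += mu * np.outer(e, phi)   # LMS weight update
    return W

# Toy nonlinear 2-input/2-output system used only as an example target;
# both outputs are separable in the inputs, so a concatenated Legendre
# expansion of order 3 can represent them exactly.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 2))
Y = np.column_stack([X[:, 0] ** 2 - X[:, 1],
                     X[:, 0] ** 3 + 0.5 * X[:, 1] ** 2])

W = train_lnn_lms(X, Y, order=3)
Phi = np.array([legendre_expand(x, 3) for x in X])
mse = np.mean((Y - Phi @ W.T) ** 2)
```

Because the output layer is linear in the expanded features, the trained weight matrix `W` directly gives an explicit closed-form input-output formula in terms of Legendre polynomials, which is the advantage the abstract highlights.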
