Abstract

The paper describes an algorithm for training the Takagi-Sugeno (TS) type neuro-fuzzy network very efficiently: it brings the performance index of the network, such as the sum squared error (SSE), down to the desired error goal much faster than the classical backpropagation algorithm (BPA). The proposed training algorithm is a slight modification of the Levenberg-Marquardt algorithm (LMA) that adopts a modified error index extension of the sum squared error as the new performance index of the network. The LMA uses the Jacobian matrix to approximate the Hessian matrix, and computing this Jacobian is the most important and difficult step in implementing the algorithm. A simple technique is therefore described that first computes the transpose of the Jacobian matrix by comparing two equations and then transposes the result to obtain the actual Jacobian matrix; this procedure is found to be robust against the modified error index extension. Furthermore, care is taken to suppress or control the magnitude of oscillations during training of the neuro-fuzzy network. Finally, the training algorithm is tested on neuro-fuzzy modeling and prediction applications involving a time series and a nonlinear plant.
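The Levenberg-Marquardt step the abstract alludes to, in which the Hessian of the SSE is approximated by the product of the Jacobian with its transpose, can be sketched as below. This is a minimal illustration under generic assumptions, not the paper's implementation: the damping factor `mu`, the function name `lm_step`, and the tiny Jacobian and error values are all hypothetical.

```python
import numpy as np

def lm_step(J, e, mu):
    """One Levenberg-Marquardt update for a sum-squared-error objective.

    J  : Jacobian of the error vector w.r.t. the network weights (m x n)
    e  : error (residual) vector (m,)
    mu : damping factor blending Gauss-Newton and gradient descent
    """
    H_approx = J.T @ J                      # Hessian approximation J^T J
    g = J.T @ e                             # gradient of 0.5 * ||e||^2
    n = J.shape[1]
    # Solve (J^T J + mu I) dw = g, then step in the descent direction -dw.
    dw = np.linalg.solve(H_approx + mu * np.eye(n), g)
    return -dw

# Hypothetical toy problem: 3 residuals, 2 weights (values made up).
J = np.array([[1.0, 2.0], [0.5, -1.0], [2.0, 0.0]])
e = np.array([0.3, -0.2, 0.1])
step = lm_step(J, e, mu=0.01)
```

For an error that is linear in the weight change, applying this step always reduces the SSE; a large `mu` shrinks the step toward small gradient-descent moves, which is one common way to control the training oscillations the abstract mentions.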

