Abstract

This paper presents an efficient approach to using the recursive least squares (RLS) learning algorithm in Takagi-Sugeno-Kang (TSK) neural fuzzy systems. In RLS learning, a reduced covariance matrix, in which the off-diagonal blocks describing the correlations between rules are set to zero, may be employed to reduce the computational burden. However, as reported in the literature, the performance of this reduced form is slightly worse than that obtained with the full covariance matrix. In this paper, we propose an enhanced local learning concept in which a threshold is used to stop learning for insufficiently fired rules. Our experiments show that the proposed approach can outperform the full-covariance-matrix approach. The enhanced local learning method is especially beneficial during the structure learning phase: it not only stops updates for insufficiently fired rules, reducing disturbances in the self-constructing neural fuzzy inference network, but also speeds up structure learning by permitting a large backpropagation learning constant.
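To make the idea concrete, the following is a minimal sketch of per-rule (block-diagonal) RLS with a firing-strength gate in the spirit of the enhanced local learning concept. It assumes a first-order TSK model, y = Σᵢ φᵢ · θᵢᵀ[x; 1], with one covariance block per rule so that cross-rule correlations are fixed at zero. All names (`LocalRLS`, `firing_threshold`, the weighted-gain form) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

class LocalRLS:
    """Sketch: per-rule weighted RLS with a stop-learning threshold."""

    def __init__(self, n_rules, n_inputs, firing_threshold=1e-3,
                 forgetting=1.0, p0=1e4):
        d = n_inputs + 1                      # consequent params: a^T x + b
        self.theta = np.zeros((n_rules, d))   # per-rule consequent parameters
        # One small covariance block per rule; the off-diagonal
        # (cross-rule) blocks of the full covariance are implicitly zero.
        self.P = np.stack([np.eye(d) * p0 for _ in range(n_rules)])
        self.thr = firing_threshold           # stop-learning threshold
        self.lam = forgetting                 # forgetting factor

    def update(self, x, y, phi):
        """x: input vector, y: target, phi: normalized firing strengths."""
        xe = np.append(x, 1.0)                # augmented input [x; 1]
        y_hat = phi @ (self.theta @ xe)       # TSK output before update
        for i, w in enumerate(phi):
            if w < self.thr:
                continue                      # skip insufficiently fired rules
            Px = self.P[i] @ xe
            k = Px / (self.lam / w + xe @ Px)          # firing-weighted gain
            self.theta[i] += k * (y - self.theta[i] @ xe)
            self.P[i] = (self.P[i] - np.outer(k, Px)) / self.lam
        return y_hat
```

The gate `w < self.thr` is the key step: rules whose normalized firing strength falls below the threshold keep their consequent parameters and covariance block unchanged for that sample, so poorly activated rules neither consume computation nor disturb the parameters of well-activated ones.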
