Abstract

In this paper, a novel learning approach is proposed for training the parameters of fuzzy neural networks based on calculating the desired outputs of their rules. We define the desired outputs of fuzzy rules as the values that minimize the output error. To find these desired outputs, a new constrained convex optimization problem is introduced and solved. Afterward, the parameters of the fuzzy rules are trained to reduce the error between the rules' current outputs and the estimated desired ones. The proposed learning method thus avoids direct backpropagation of the output error, which can cause vanishing gradients and, consequently, convergence to a local optimum; as a result, it does not require any sophisticated initialization method. This learning method is successfully used to train a new Takagi–Sugeno–Kang (TSK) fuzzy neural network with correlated fuzzy rules. The proposed paradigm, comprising the correlation-aware TSK architecture together with the learning method, is evaluated on six real-world time-series prediction, regression, and nonlinear system identification problems. According to the experimental results, the proposed method outperforms competing methods while using a more parsimonious structure.
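To make the two-stage idea concrete, the following is a minimal sketch (not the paper's actual formulation, whose constrained convex problem is not detailed in the abstract). It models a toy TSK network whose output is a normalized-firing-strength combination of linear rule consequents, and computes "desired" rule outputs as the minimum-norm adjustment of the current rule outputs that makes the network output hit the target exactly; all sizes, names, and the Gaussian membership choice are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: 2 inputs, 3 TSK rules (dimensions are assumptions).
n_rules, n_in = 3, 2
centers = rng.normal(size=(n_rules, n_in))   # Gaussian membership centers
sigmas = np.ones((n_rules, n_in))            # Gaussian membership widths
A = rng.normal(size=(n_rules, n_in))         # consequent slopes, one row per rule
b = np.zeros(n_rules)                        # consequent intercepts

def firing(x):
    # Product of Gaussian memberships per rule, normalized across rules.
    w = np.exp(-0.5 * ((x - centers) / sigmas) ** 2).prod(axis=1)
    return w / w.sum()

def rule_outputs(x):
    # Linear TSK consequent of each rule.
    return A @ x + b

def predict(x):
    # Network output: firing-strength-weighted average of rule outputs.
    return firing(x) @ rule_outputs(x)

def desired_rule_outputs(x, target):
    # Closed-form minimum-norm solution of the convex problem
    #   min_d ||d - r||^2   s.t.   w_bar . d = target
    # i.e. the smallest change to the rule outputs that zeroes the output error.
    w_bar = firing(x)
    r = rule_outputs(x)
    return r + w_bar * (target - w_bar @ r) / (w_bar @ w_bar)

x = np.array([0.3, -0.7])
t = 1.5
d = desired_rule_outputs(x, t)
print(np.isclose(firing(x) @ d, t))  # the desired outputs reproduce the target
```

Each rule's consequent parameters could then be fitted toward its own desired output (e.g. per-rule least squares over a batch), so no error gradient is propagated through the aggregation step.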
