Abstract

In this paper, a novel learning approach for training the parameters of fuzzy neural networks, based on calculating the desired outputs of their rules, is proposed. We define the desired output of a fuzzy rule as the value that minimizes the output error. To find these desired outputs, a new constrained convex optimization problem is introduced and solved. The parameters of the fuzzy rules are then trained to reduce the error between the rules' current outputs and the estimated desired ones. The proposed learning method thus avoids direct backpropagation of the output error, which can lead to vanishing gradients and, consequently, to getting stuck in a local optimum; as a result, it does not require any sophisticated initialization scheme. This learning method is successfully used to train a new Takagi–Sugeno–Kang (TSK) fuzzy neural network with correlated fuzzy rules. The proposed paradigm, comprising the correlation-aware TSK architecture together with the learning method, is applied to six real-world problems spanning time-series prediction, regression, and nonlinear system identification. The experimental results show that the proposed method outperforms comparable methods while using a more parsimonious structure.
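To make the idea concrete, the following is a minimal, illustrative sketch (not the authors' exact algorithm) of training by estimated desired rule outputs in a zero-order TSK system. The paper solves a constrained convex program to obtain the desired outputs; here, for simplicity, we use the unconstrained minimum-norm correction that cancels the output error, and all names, rule counts, and membership widths are assumptions for the example.

```python
import numpy as np

# Zero-order TSK system: y(x) = sum_i w_i(x) * c_i, where w_i are
# normalized Gaussian firing strengths and c_i are rule consequents.

rng = np.random.default_rng(0)

def firing_strengths(x, centers, sigma=0.1):
    """Normalized Gaussian rule activations for a scalar input x."""
    w = np.exp(-((x - centers) ** 2) / (2 * sigma ** 2))
    return w / w.sum()

# A toy target function and training set (assumptions for illustration)
f = lambda x: np.sin(2 * np.pi * x)
X = rng.uniform(0, 1, 200)
Y = f(X)

centers = np.linspace(0, 1, 7)   # rule centers, held fixed here
c = np.zeros_like(centers)       # consequent parameters to be trained

eta = 0.5
for epoch in range(50):
    for x, y_d in zip(X, Y):
        w = firing_strengths(x, centers)
        y = w @ c
        # Step 1: desired rule outputs o* = smallest change to the current
        # outputs that makes the network output equal the target, i.e. the
        # minimum-norm solution of w @ o = y_d (the paper uses a
        # constrained convex problem at this step instead).
        o_star = c + w * (y_d - y) / (w @ w)
        # Step 2: train each consequent toward its desired output,
        # weighted by rule activation -- no output-error backpropagation.
        c += eta * w * (o_star - c)

# Mean squared error of the trained network on the training inputs
err = np.mean([(firing_strengths(x, centers) @ c - f(x)) ** 2 for x in X])
```

Because each rule is pulled toward a locally estimated target rather than receiving a gradient propagated through the whole network, the update remains well-scaled regardless of initialization, which is the property the abstract attributes to the proposed method.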
