Abstract

Existing multitask Takagi-Sugeno-Kang (TSK) fuzzy modeling methods often produce highly complex fuzzy models with numerous redundant rules and consequent parameters. To address this, we propose a novel multitask TSK fuzzy modeling method, called mtSparseTSK, which learns a compact set of fuzzy rules and shared consequent parameters across tasks in a unified procedure. Specifically, we perform fuzzy rule reduction and consequent parameter selection across tasks by devising novel group sparsity regularizations in the model's learning criterion, and we integrate intertask relations into the proposed TSK model for multitask learning. We fully exploit the block structure of TSK fuzzy models to formulate a joint block sparse optimization problem, and we develop an alternating direction method of multipliers (ADMM) procedure to find its optimal solution. Experiments on synthetic and real-world datasets demonstrate the superior performance of the proposed method over existing approaches for multitask fuzzy system modeling.
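The abstract's key mechanism is group sparsity: by penalizing whole blocks of consequent parameters together, an entire fuzzy rule can be driven to zero and pruned. The standard building block for this inside an ADMM loop is the group (block) soft-thresholding proximal operator. The sketch below is illustrative only and is not the paper's mtSparseTSK algorithm; the function name `prox_group_l2`, the threshold `tau`, and the toy grouping are all assumptions for demonstration.

```python
import numpy as np

def prox_group_l2(w, groups, tau):
    """Block soft-thresholding: the proximal operator of tau * sum_g ||w_g||_2.

    Groups whose Euclidean norm falls at or below tau are zeroed out
    entirely -- the mechanism by which a group sparsity penalty can
    prune a whole block of consequent parameters (i.e., a whole rule)
    at once, rather than zeroing coefficients one by one.
    (Illustrative helper; not the paper's mtSparseTSK procedure.)
    """
    out = np.zeros_like(w, dtype=float)
    for g in groups:
        norm = np.linalg.norm(w[g])
        if norm > tau:
            # Shrink the surviving group toward zero by a factor (1 - tau/norm).
            out[g] = (1.0 - tau / norm) * w[g]
    return out

# Toy example: two "rules", each a block of 3 consequent parameters.
w = np.array([0.9, 1.2, -0.7,   0.05, -0.02, 0.03])
groups = [slice(0, 3), slice(3, 6)]
pruned = prox_group_l2(w, groups, tau=0.5)
# The second (small-norm) block is eliminated as a whole; the first survives, shrunk.
```

Within a full ADMM iteration, a step like this would typically alternate with a least-squares update of the consequent parameters and a dual-variable update; the paper's joint block sparse formulation couples such blocks across tasks.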
