Abstract

Improving the tradeoff between accuracy and interpretability is essential for Takagi–Sugeno–Kang (TSK) fuzzy systems to handle high-dimensional data and provide insight into real-world tasks. However, a TSK fuzzy system becomes complex and difficult to interpret as the data dimension increases. Here, we report an ensemble classifier, an enhanced adaptive network-based fuzzy inference system (ANFIS) that integrates improved bagging and dropout to build concise fuzzy rule sets. First, the high-dimensional feature space is decomposed into a series of low-dimensional feature subsets using bagging and the random subspace method, and multiple ANFIS sub-models are trained on these subsets. An improved dropout strategy is then applied during ANFIS training: rules are temporarily disabled in each training epoch and pruned after training, yielding sparse rule sets composed of high-quality rules. The sub-models are subsequently aggregated to perform fuzzy inference. Results on high-dimensional benchmark datasets confirm that both the bagging and dropout strategies are effective, providing high interpretability by reducing the rule co-firing degrees and the number of rules in the sub-models while maintaining accuracy.
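
The sketch below is a minimal illustration of the ensemble idea summarized above, not the authors' implementation: it combines bootstrap sampling with random feature subspaces, trains a simplified zero-order TSK sub-model with rule-level dropout (the paper's enhanced ANFIS uses full first-order consequents), and aggregates sub-model outputs by averaging; the post-training rule-deletion step is omitted for brevity. All names (TinyTSK, rule_drop_rate, train_ensemble, and so on) are assumed for illustration only.

```python
# Minimal sketch (assumed names, simplified model) of bagging + random subspace
# with rule-level dropout for TSK-style sub-models, aggregated by averaging.
import numpy as np

class TinyTSK:
    """Simplified zero-order TSK model: Gaussian rule antecedents, scalar consequents."""
    def __init__(self, n_rules=5, rule_drop_rate=0.2, epochs=50, lr=0.05, seed=0):
        self.n_rules, self.rule_drop_rate = n_rules, rule_drop_rate
        self.epochs, self.lr = epochs, lr
        self.rng = np.random.default_rng(seed)

    def _firing(self, X, keep):
        # Gaussian membership per feature, product T-norm across features,
        # then normalized firing strengths; `keep` masks dropped rules.
        z = (X[:, None, :] - self.centers[None]) / self.sigmas[None]
        f = np.exp(-0.5 * np.sum(z ** 2, axis=2)) * keep        # (n_samples, n_rules)
        return f / (f.sum(axis=1, keepdims=True) + 1e-12)

    def fit(self, X, y):
        n, d = X.shape
        idx = self.rng.choice(n, size=self.n_rules, replace=False)
        self.centers = X[idx].copy()                            # rule centres from data
        self.sigmas = np.full((self.n_rules, d), X.std(axis=0) + 1e-6)
        self.consequents = np.zeros(self.n_rules)
        for _ in range(self.epochs):
            # Rule-level dropout: temporarily disable a random subset of rules.
            keep = (self.rng.random(self.n_rules) >= self.rule_drop_rate).astype(float)
            if keep.sum() == 0:
                keep[self.rng.integers(self.n_rules)] = 1.0
            w = self._firing(X, keep)
            err = w @ self.consequents - y
            self.consequents -= self.lr * (w.T @ err) / n       # gradient step on consequents
        return self

    def predict(self, X):
        return self._firing(X, np.ones(self.n_rules)) @ self.consequents

def train_ensemble(X, y, n_models=10, subspace_dim=5, seed=0):
    """Bagging + random subspace: each sub-model sees a bootstrap sample
    restricted to a random low-dimensional feature subset."""
    rng = np.random.default_rng(seed)
    models = []
    for m in range(n_models):
        rows = rng.choice(len(X), size=len(X), replace=True)            # bootstrap
        cols = rng.choice(X.shape[1], size=min(subspace_dim, X.shape[1]),
                          replace=False)                                # random subspace
        models.append((TinyTSK(seed=seed + m).fit(X[rows][:, cols], y[rows]), cols))
    return models

def predict_ensemble(models, X):
    # Aggregate sub-model outputs by averaging, then threshold for binary labels.
    scores = np.mean([m.predict(X[:, cols]) for m, cols in models], axis=0)
    return (scores >= 0.5).astype(int)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 30))                                      # high-dimensional toy data
    y = (X[:, 0] + X[:, 1] > 0).astype(float)
    models = train_ensemble(X, y)
    print("train accuracy:", np.mean(predict_ensemble(models, X) == y))
```

In this sketch, each sub-model only ever sees a low-dimensional feature subspace, so its rule antecedents stay short, while averaging across sub-models recovers predictive accuracy; the rule-dropout mask plays a regularizing role analogous to unit dropout in neural networks.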
