Abstract

This paper proposes a CNN-based born-again Takagi-Sugeno-Kang (TSK) fuzzy classifier, denoted CNNBaTSK. CNNBaTSK has the following distinctive characteristics: 1) it provides a new perspective on knowledge distillation, using a non-iterative learning method (the least learning machine with knowledge distillation, LLM-KD) to solve the consequent parameters of the fuzzy rules, where the consequent parameters are trained jointly on the ground-truth label loss, the knowledge distillation loss, and a regularization term; 2) owing to the inherent advantages of fuzzy rules, CNNBaTSK can express the dark knowledge acquired from the CNN in an interpretable manner. Specifically, the dark knowledge (soft-label information) is partitioned into five fixed antecedent fuzzy spaces. The centers of the soft-label information in the different fuzzy rules are {0, 0.25, 0.5, 0.75, 1}, with the corresponding linguistic interpretations {very low, low, medium, high, very high}. For the consequent part of each fuzzy rule, the original features are used to train the consequent parameters, which ensures direct interpretability in the original feature space. Experimental results on benchmark datasets and the CHB-MIT EEG dataset demonstrate that CNNBaTSK simultaneously improves classification performance and model interpretability.
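The antecedent structure described above can be illustrated with a minimal sketch: a soft-label value is matched against the five fixed centers {0, 0.25, 0.5, 0.75, 1} to produce normalized firing levels, which then weight per-rule linear consequents on the original features. The Gaussian membership function, the kernel width, and all function names here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# Five fixed antecedent centers for the soft-label value, as stated in the
# abstract; linguistically: very low, low, medium, high, very high.
CENTERS = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
SIGMA = 0.1  # assumed Gaussian kernel width (not specified in the abstract)

def firing_levels(soft_label: float) -> np.ndarray:
    """Normalized Gaussian memberships of one soft-label value to the five rules."""
    mu = np.exp(-((soft_label - CENTERS) ** 2) / (2 * SIGMA ** 2))
    return mu / mu.sum()

def tsk_output(soft_label: float, x: np.ndarray, W: np.ndarray) -> float:
    """TSK inference: firing-level-weighted sum of per-rule linear consequents.

    W holds one row of consequent parameters per rule, applied to the
    original feature vector x (with a bias term appended).
    """
    x_aug = np.append(x, 1.0)   # affine consequent: w @ [x, 1]
    rule_outputs = W @ x_aug    # one linear output per fuzzy rule
    return float(firing_levels(soft_label) @ rule_outputs)
```

Because the antecedents depend only on the soft labels while the consequents operate on the original features, each rule's linear consequent remains readable in the original feature space, which is the interpretability property the paper emphasizes.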
