Abstract

A major limitation of fuzzy and neuro-fuzzy systems is their inability to deal with high-dimensional datasets. This stems primarily from the use of a T-norm, particularly the product or minimum (or a softer version of it), to compute rule firing strengths. Consequently, there is hardly any work dealing with datasets having more than a hundred or so features. Here, we propose a neuro-fuzzy framework that can handle datasets with even more than 7000 features! In this context, we propose an adaptive softmin (Ada-softmin), which effectively overcomes the "numeric underflow" and "fake minimum" problems that arise in existing fuzzy systems when dealing with high-dimensional data. We call the resulting system an Adaptive Takagi-Sugeno-Kang (AdaTSK) fuzzy system. We then equip the AdaTSK system to perform feature selection and rule extraction in an integrated manner. For this purpose, a novel gate function is introduced and embedded only in the consequent parts; it identifies the useful features and rules in two successive phases of learning. Unlike conventional fuzzy rule bases, we design an enhanced fuzzy rule base (En-FRB) that maintains an adequate number of rules without letting the rule count grow exponentially with the number of features, as typically happens in fuzzy neural networks. The integrated Feature Selection and Rule Extraction AdaTSK (FSRE-AdaTSK) system consists of three sequential phases: (i) feature selection, (ii) rule extraction, and (iii) fine-tuning. The effectiveness of FSRE-AdaTSK is demonstrated on 19 datasets, five of which have more than 2000 dimensions, including two with more than 7000 features. This may be the first attempt to develop fuzzy rule-based classifiers that can directly handle more than 7000 features without requiring separate feature selection or any other dimensionality reduction method.
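
The "numeric underflow" and "fake minimum" problems mentioned above can be reproduced in a few lines of NumPy. The sketch below only illustrates these two failure modes; it is not the Ada-softmin itself (the abstract does not give its adaptation rule), and the dimensionality, membership values, softmin form, and exponent choices are hypothetical.

```python
import numpy as np

# A rule's firing strength aggregates per-feature membership values
# mu_1, ..., mu_D (each in (0, 1]) with a T-norm or a softmin surrogate.
rng = np.random.default_rng(0)
D = 7000                        # dimensionality comparable to the largest datasets in the paper
mu = rng.uniform(0.1, 1.0, D)   # hypothetical membership values for a single rule

# (1) Product T-norm: multiplying thousands of values in (0, 1) underflows,
#     so the rule fires with strength exactly 0.
print(np.prod(mu))              # 0.0  -> "numeric underflow"

# (2) A common softmin surrogate of the minimum:
#     softmin_q(mu) = (mean(mu**(-q)))**(-1/q)
def softmin(x, q):
    return np.mean(x ** (-float(q))) ** (-1.0 / q)

print(mu.min())                 # true minimum, close to 0.1
print(softmin(mu, 8))           # noticeably larger than the true minimum ("fake minimum")
print(softmin(mu, 400))         # mu**(-400) overflows to inf and the result collapses to 0.0
```

A fixed exponent thus faces a trade-off in high dimensions: small values give a fake minimum, large values break numerically. The Ada-softmin proposed in the paper is designed to avoid both effects; its details are in the full text.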
