Abstract

The general fuzzy min–max neural network (GFMMNN) is an efficient neuro-fuzzy system for data classification. However, one drawback of its original learning algorithms is their inability to handle and learn directly from mixed-attribute data without resorting to encoding techniques. While categorical feature encoding methods can be combined with the GFMMNN learning algorithms, they exhibit many shortcomings. Other improved approaches proposed in the literature are unsuitable for online learning algorithms operating in dynamically changing environments without the ability to retrain or access the full historical data, a setting common to many real-world applications. This paper proposes an extended online learning algorithm for the GFMMNN. The proposed method can handle datasets with both continuous and categorical features. It uses the change in the entropy values of the categorical features of the samples contained in a hyperbox to determine whether the current hyperbox can be expanded to include the categorical values of a new training instance. An extended architecture of the original GFMMNN and a new membership function are introduced for mixed-attribute data. Important mathematical properties of the proposed learning algorithm are also presented and proved. Extensive experiments confirmed the superior and stable classification performance of the proposed approach in comparison with other relevant learning algorithms for the GFMM model.
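
To illustrate the entropy-based expansion criterion summarized above, the following Python sketch checks whether absorbing a new sample's categorical values keeps the entropy change of each categorical feature within a tolerance. The count-based representation of a hyperbox's categorical part, the threshold name `delta`, and the per-feature check are illustrative assumptions, not the exact formulation used in the paper.

```python
# A minimal sketch (assumed formulation) of an entropy-change test for
# expanding the categorical part of a hyperbox to absorb a new sample.
from collections import Counter
from math import log2
from typing import List

def entropy(counts: Counter) -> float:
    """Shannon entropy (in bits) of a categorical value distribution."""
    total = sum(counts.values())
    if total == 0:
        return 0.0
    return -sum((c / total) * log2(c / total) for c in counts.values())

def can_expand_categorical(
    box_counts: List[Counter],  # one Counter of value frequencies per categorical feature
    new_values: List[str],      # categorical values of the incoming training sample
    delta: float,               # maximum allowed entropy increase per feature (assumed parameter)
) -> bool:
    """Return True if absorbing the sample keeps every feature's entropy change <= delta."""
    for counts, value in zip(box_counts, new_values):
        before = entropy(counts)
        after = entropy(counts + Counter([value]))
        if after - before > delta:
            return False
    return True

# Example: a hyperbox whose single categorical feature has absorbed {'red': 3, 'blue': 1}
box = [Counter({"red": 3, "blue": 1})]
print(can_expand_categorical(box, ["red"], delta=0.1))    # True: entropy does not increase
print(can_expand_categorical(box, ["green"], delta=0.1))  # False: an unseen value raises entropy
```

The intuition is that a hyperbox should only absorb categorical values that keep its contents homogeneous; a large entropy increase signals that the new sample's categorical values do not fit the values already covered by the hyperbox.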
