Fuzzy clustering-based neural networks (FCNNs) built on information granulation techniques have proven to be effective Takagi-Sugeno (TS)-type fuzzy models. However, existing FCNNs cope poorly with sequential learning tasks. In this study, we introduce incremental FCNNs (IFCNNs), which update themselves dynamically whenever new training data arrive, whether as a single datum or as a block. Specifically, we employ a dynamic (incremental) fuzzy C-means (FCM) clustering algorithm to reveal the structure of the data and divide the input space into several subregions. Within this partition, the dynamic FCM adaptively adjusts its prototype positions as sequential data arrive. Because training data arrive over time rather than all at once, incremental learning methods may lose classification (prediction) accuracy relative to batch learning models. To address this challenge, we use quasi-fuzzy local models (QFLMs) built on modified Schmidt neural networks in place of the linear consequent functions commonly used in TS-type fuzzy models, thereby strengthening the representation of each fuzzy subspace. Meanwhile, recursive least squares (LSE) estimation updates the QFLM weights from data arriving one by one or block by block (with fixed or varying block size). In addition, L2 regularization is incorporated into the weight estimation to mitigate the deterioration of generalization ability caused by potential overfitting. The proposed approach yields a new construction of FCNNs that handles incremental data effectively while delivering sound generalization capability. Experiments on a broad collection of machine-learning datasets and a real-world application demonstrate the validity and performance of the presented methods. The results show that the proposed models maintain sound classification accuracy while processing sequential data efficiently.
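The abstract does not spell out the dynamic FCM update itself. As a point of reference only, the following is a minimal sketch of one common online FCM prototype-update rule, not necessarily the exact dynamic FCM variant used in the paper: each prototype drifts toward an incoming sample in proportion to its fuzzified membership. The learning rate lr and fuzzifier m are assumed hyperparameters.

```python
import numpy as np

def fcm_memberships(x, V, m=2.0, eps=1e-9):
    """Standard FCM membership of sample x (d,) in each prototype of V (c x d)."""
    d2 = np.sum((V - x) ** 2, axis=1) + eps   # squared distances to prototypes
    inv = d2 ** (-1.0 / (m - 1.0))            # FCM membership kernel
    return inv / inv.sum()                    # memberships sum to 1

def online_fcm_step(x, V, lr=0.05, m=2.0):
    """One sequential update: move each prototype toward the new sample,
    weighted by its fuzzified membership u_i^m (hypothetical sketch, not
    the paper's exact dynamic FCM)."""
    u = fcm_memberships(x, V, m)
    V += lr * (u ** m)[:, None] * (x - V)
    return V, u
```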
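The recursive LSE step with L2 regularization is a standard recursion; below is a minimal sketch of a ridge-initialized block RLS update, assuming the QFLM output is linear in some feature vector of dimension dim (the class name, lam, and update_block are illustrative, not the paper's API). Initializing P as (lam * I)^{-1} is equivalent to imposing an L2 penalty lam on the accumulated data; a block of size 1 recovers the classic one-by-one update, and the block size may vary between calls.

```python
import numpy as np

class RegularizedRLS:
    """Recursive least squares with L2 (ridge) initialization: a sketch of
    the kind of one-by-one / block-by-block weight update the abstract
    describes for the QFLMs."""

    def __init__(self, dim, lam=1e-2):
        self.w = np.zeros(dim)        # weight vector of one local model
        self.P = np.eye(dim) / lam    # inverse of (lam*I + X^T X) seen so far

    def update_block(self, X, y):
        """Incorporate a block of samples X (n x dim) with targets y (n,)."""
        X = np.atleast_2d(X)
        y = np.atleast_1d(y)
        PXt = self.P @ X.T
        # Sherman-Morrison-Woodbury update of P for the whole block at once.
        G = PXt @ np.linalg.inv(np.eye(X.shape[0]) + X @ PXt)  # gain (dim x n)
        self.w = self.w + G @ (y - X @ self.w)
        self.P = self.P - G @ X @ self.P
        return self.w
```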