Abstract

In this paper we propose a novel incremental learning approach based on a hybrid fuzzy neural network framework. A key feature of the approach is that the fuzzy neural network (FNN) model is adapted to each new data sample. Typical FNN training algorithms are inefficient for accurate online time-series prediction because the network must be retrained from scratch every time the training set is modified. To reduce the cost of FNN learning for dynamic systems, a general methodology leading to fast algorithms for FNN modeling is developed. The FNN-LM algorithm for a static FNN and the incremental learning algorithm (ILA) for a dynamic fuzzy neural network (DFNN) are presented to make the model approximate each new sample. The ILA has the advantage of avoiding both the growth of the ranks of its matrices and the solution of matrix inverses as samples are added gradually. When used for accurate online time-series prediction, the DFNN model can update a trained static FNN very quickly according to each sample added to the training set. Numerical experiments validate our theoretical results and exhibit the excellent performance of the proposed approach in modeling accuracy and learning convergence.
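
The abstract does not spell out the ILA update equations, but the stated property of avoiding matrix inversion as samples arrive one at a time is characteristic of recursive least-squares-style rank-one updates. The sketch below is only a generic illustration of that idea for the linear output layer of an FNN, under that assumption; the class and variable names (IncrementalLinearLayer, P, w) are hypothetical and not taken from the paper.

```python
import numpy as np

# Hypothetical sketch: recursive (Sherman-Morrison style) update of the linear
# output-layer weights of an FNN when one new sample arrives, so that no matrix
# is re-inverted and no matrix dimension grows with the number of samples.

class IncrementalLinearLayer:
    def __init__(self, n_features, reg=1e-3):
        # P approximates the inverse of the regularized Gram matrix (Phi^T Phi + reg*I).
        self.P = np.eye(n_features) / reg
        self.w = np.zeros(n_features)          # output-layer weights

    def update(self, phi, y):
        """Fold one new sample (phi: firing-strength/feature vector, y: target)
        into the weights with a rank-one correction, no matrix inversion."""
        phi = np.asarray(phi, dtype=float)
        Pphi = self.P @ phi
        gain = Pphi / (1.0 + phi @ Pphi)       # Kalman-style gain vector
        self.w += gain * (y - phi @ self.w)    # correct the error on the new sample
        self.P -= np.outer(gain, Pphi)         # Sherman-Morrison rank-one update of P

    def predict(self, phi):
        return np.asarray(phi, dtype=float) @ self.w


# Toy usage: stream samples one at a time, as in online time-series prediction.
rng = np.random.default_rng(0)
layer = IncrementalLinearLayer(n_features=3)
true_w = np.array([0.5, -1.0, 2.0])
for _ in range(200):
    phi = rng.normal(size=3)
    layer.update(phi, phi @ true_w + 0.01 * rng.normal())
print(layer.w)   # approaches true_w without ever solving a full linear system
```

Each update costs only O(n_features^2) work regardless of how many samples have been seen, which is the practical benefit the abstract attributes to incremental learning over retraining from scratch.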
