Abstract
Online sequential extreme learning machine (OS-ELM) and its variants provide a promising way to solve data stream problems, but most of them do not take the timeliness of the problems into account, which may degrade model performance. The main reason is that these algorithms cannot adapt to the latest data when the distribution of the data stream changes. To mitigate this limitation, a forgetting factor is introduced into the relevant models to balance the relative importance of past data and new data when necessary. However, no efficient way to set the forgetting factor properly has been available so far. In this paper, we develop a novel updating strategy for the forgetting factor and propose a dynamic forgetting factor based OS-ELM algorithm (DOS-ELM). In the sequential learning phase of DOS-ELM, the forgetting factor is adjusted dynamically according to the change degree of the model accuracy in each learning epoch. This updating process does not require setting any parameters manually and thus greatly improves the flexibility of the model. Experimental results on ten classification problems, five regression problems, and one time-series problem show that DOS-ELM handles both stationary and non-stationary data stream problems well. In addition, we extend DOS-ELM to an online deep model named ML-DOS-ELM, which can handle more complex tasks such as face recognition and handwritten digit recognition. Our experimental evaluations show that both DOS-ELM and ML-DOS-ELM achieve higher prediction accuracy than similar algorithms.
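The abstract describes OS-ELM-style sequential learning with a forgetting factor. The sketch below is not from the paper; it is a minimal recursive least-squares formulation of an OS-ELM update, assuming the standard exponentially-weighted variant where a factor `lam` in (0, 1] discounts past data (function names `os_elm_init` and `os_elm_step` are illustrative, not the authors').

```python
import numpy as np

def os_elm_init(H0, T0, reg=1e-3):
    """Initial batch solution: beta0 = (H0^T H0 + reg*I)^{-1} H0^T T0.

    H0: (n0, L) hidden-layer outputs for the initial chunk; T0: (n0, m) targets.
    Returns the output weights beta0 and the inverse correlation matrix P0.
    """
    L = H0.shape[1]
    P0 = np.linalg.inv(H0.T @ H0 + reg * np.eye(L))
    return P0 @ H0.T @ T0, P0

def os_elm_step(beta, P, H, T, lam=1.0):
    """One sequential update with forgetting factor lam (0 < lam <= 1).

    lam = 1 recovers plain OS-ELM; lam < 1 down-weights past chunks,
    letting the model track a drifting data stream.
    """
    n = H.shape[0]
    # Discounted recursive least-squares update of the inverse correlation matrix.
    K = P @ H.T @ np.linalg.inv(lam * np.eye(n) + H @ P @ H.T)
    P_new = (P - K @ H @ P) / lam
    # Correct the output weights toward the new chunk's residual.
    beta_new = beta + P_new @ H.T @ (T - H @ beta)
    return beta_new, P_new
```

In use, one would map each input chunk through a fixed random hidden layer to get `H`, then call `os_elm_step` per chunk; only `beta` and `P` are kept between chunks, so memory cost is independent of stream length.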
Highlights
Artificial neural networks have achieved significant breakthroughs in many fields, such as speech recognition [1], action recognition [49], image processing [2], [3], and natural language processing [4].
We focus on algorithms with a controlling parameter named the forgetting factor, such as TOS-ELM and WOS-ELM (where ELM denotes the extreme learning machine), which balance the relative importance of new and past data and adjust the model to pay more attention to the new data when concept drift is detected.
In the dynamic forgetting factor based OS-ELM algorithm (DOS-ELM), the forgetting factor balances the relative importance of new data and past data and is adjusted automatically and dynamically according to the accuracy change of the model in each learning epoch.
Summary
DOS-ELM offers three main advantages: 1) its forgetting factor is automatically and dynamically adjusted according to the iterative error, avoiding the model instability caused by manually set hyperparameters such as the setpoint error in WOS-ELM; 2) it can be extended into a deep online model with multiple hidden layers (ML-DOS-ELM), which can handle complex tasks in the online learning scenario; 3) it provides a unified framework for classification, regression, and time-series problems. Compared with other types of concept drift, the data stream changes relatively smoothly under gradual concept drift. Considering this characteristic, we choose the arctangent function (denoted atan(·)) to calculate the value of the forgetting factor from the change degree (denoted E) of the model accuracy in each learning epoch, because the arctangent function has good smoothness.
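The summary says the forgetting factor is computed from the accuracy-change degree E via the smooth arctangent function, but does not give the exact mapping. The sketch below is one plausible monotone choice under that description, not the paper's formula: a larger accuracy drop yields a smaller factor (more forgetting), while no drop keeps the factor at 1 (the floor `lam_min` is an assumed illustrative parameter).

```python
import math

def update_forgetting_factor(E, lam_min=0.9):
    """Map the accuracy-change degree E (E >= 0 when accuracy drops) to a
    forgetting factor in (lam_min, 1].

    Illustrative mapping only: (2/pi)*atan(E) rises smoothly from 0 toward 1,
    so larger drops push the factor smoothly toward lam_min.
    """
    drop = max(E, 0.0)  # an accuracy improvement triggers no forgetting
    return 1.0 - (1.0 - lam_min) * (2.0 / math.pi) * math.atan(drop)
```

The smoothness of atan(·) means small accuracy fluctuations barely move the factor, which matches the summary's rationale for gradual concept drift, where the stream changes slowly and abrupt forgetting would be counterproductive.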