Abstract

The number of algorithms based on the Extreme Learning Machine (ELM), which trains Single Layer Feedforward Neural Networks (SLFNs), has grown in recent years due to their fast training stage and good generalization performance. Many variants have been proposed to handle outliers, online learning, and other scenarios. Recently, the Fast Deep Stacked Network (FDSN) was proposed as an alternative to large ELM networks, achieving similar performance while using less memory than ELM-based algorithms. Among online methods, which handle data that arrives over time and is subject to concept drift, ensemble approaches are one of the most promising categories. FDSN can be viewed as an ensemble of SLFNs in which the network output is the output of the most recent SLFN. In this paper, we propose the Online Sequential FDSN (OSFDSN), which is similar to FDSN, except that each of its SLFN modules contributes to the network output with a weight that is dynamically computed from the most recent data. We compare our method with similar techniques that have publicly available implementations. Our experiments show that the proposed technique is statistically equivalent to the others in terms of RMSE, while having a faster updating scheme.
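The abstract does not give the exact weighting rule, so the sketch below is only a rough illustration of the idea: one ELM-style SLFN module is trained per incoming data chunk, and each module's contribution to the output is weighted by its inverse mean squared error on the newest chunk. The module class, the inverse-error weighting, and all hyperparameters are assumptions for illustration, and the sketch flattens FDSN's stacking into a plain ensemble; it is not the authors' OSFDSN update.

```python
import numpy as np


class SLFNModule:
    """One ELM-style SLFN: random hidden layer, least-squares output
    weights. A simplification of the modules stacked in FDSN."""

    def __init__(self, n_inputs, n_hidden, rng):
        self.W = rng.normal(size=(n_inputs, n_hidden))  # random input weights
        self.b = rng.normal(size=n_hidden)              # random biases
        self.beta = None                                # output weights (fit below)

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        H = self._hidden(X)
        # Regularized least squares for the output weights.
        self.beta = np.linalg.solve(
            H.T @ H + 1e-3 * np.eye(H.shape[1]), H.T @ y
        )

    def predict(self, X):
        return self._hidden(X) @ self.beta


class WeightedStackedEnsemble:
    """Illustrative online ensemble: a new SLFN module per data chunk,
    with module weights recomputed from errors on the most recent chunk.
    The inverse-error rule here is an assumption, not the paper's."""

    def __init__(self, n_inputs, n_hidden=50, seed=0):
        self.rng = np.random.default_rng(seed)
        self.n_inputs, self.n_hidden = n_inputs, n_hidden
        self.modules, self.weights = [], np.array([])

    def partial_fit(self, X_chunk, y_chunk):
        m = SLFNModule(self.n_inputs, self.n_hidden, self.rng)
        m.fit(X_chunk, y_chunk)
        self.modules.append(m)
        # Dynamically recompute weights from each module's error
        # on the newest chunk (assumed weighting scheme).
        errs = np.array([np.mean((mod.predict(X_chunk) - y_chunk) ** 2)
                         for mod in self.modules])
        inv = 1.0 / (errs + 1e-12)
        self.weights = inv / inv.sum()

    def predict(self, X):
        preds = np.stack([mod.predict(X) for mod in self.modules])
        return self.weights @ preds  # weighted contribution of every module


if __name__ == "__main__":
    # Toy usage: stream chunks of a synthetic regression problem.
    rng = np.random.default_rng(1)
    model = WeightedStackedEnsemble(n_inputs=3)
    for _ in range(5):
        X = rng.normal(size=(200, 3))
        y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=200)
        model.partial_fit(X, y)
    print(model.predict(X[:5]))
```

Because only the newest module is fit and the weights are a cheap error re-evaluation, each update avoids retraining earlier modules, which is the kind of fast updating scheme the abstract claims for OSFDSN.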
