Abstract

The Random Vector Functional Link (RVFL) network is popular among researchers in many areas of machine learning because it achieves good performance with relatively little training time. Recent works extend RVFL into deep and ensemble versions. However, RVFL lacks the effective feature extraction commonly used in time series classification, which leads to poor performance on time series classification tasks. Moreover, deep RVFL is a relatively new and evolving area of research. In this paper, we present a framework that extracts features from Residual Networks (ResNets) and trains Ensemble Deep Random Vector Functional Link (edRVFL) networks. We use the features extracted from every residual block to train an ensemble of edRVFLs. We propose the following enhancements to edRVFL. Firstly, we diversify the structure of edRVFL and the direct-link features to encourage diversity. Secondly, we build an ensemble of edRVFLs with the top two activation functions. Thirdly, we use two-stage tuning to save computational cost. Lastly, we take a weighted average of the decisions made by every edRVFL. Experiments on the 55 largest UCR datasets show that using features extracted from all residual blocks improves performance, and each of our proposed enhancements improves classification accuracy or reduces computational effort. Consequently, our proposed framework outperforms all traditional and deep learning-based time series classification methods.
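To make the described pipeline concrete, the sketch below shows one possible reading of it: an edRVFL with random hidden layers, direct links to the raw input, and a closed-form ridge classifier per layer, trained separately on the features of each residual block and combined by a weighted average of decisions. This is a minimal illustration under assumptions, not the authors' implementation; the class `EDRVFL`, the helper `ensemble_predict`, the ReLU activation, and all hyper-parameter names are hypothetical.

```python
import numpy as np

class EDRVFL:
    """Minimal ensemble deep RVFL sketch (hypothetical hyper-parameters)."""

    def __init__(self, n_layers=3, n_hidden=256, reg=1.0, seed=0):
        self.n_layers, self.n_hidden, self.reg = n_layers, n_hidden, reg
        self.rng = np.random.default_rng(seed)

    def _ridge(self, D, Y):
        # Closed-form ridge regression for one layer's output weights.
        return np.linalg.solve(D.T @ D + self.reg * np.eye(D.shape[1]), D.T @ Y)

    def _design_matrices(self, X):
        # Yield [hidden features, direct link] for each stacked random layer.
        H = X
        for l in range(self.n_layers):
            Z = H if l == 0 else np.hstack([H, X])   # direct link into deeper layers
            if len(self.W) <= l:                     # draw random weights once, at fit time
                self.W.append(self.rng.standard_normal((Z.shape[1], self.n_hidden)))
            H = np.maximum(Z @ self.W[l], 0.0)       # ReLU as one candidate activation
            yield np.hstack([H, X])                  # direct link into each output layer

    def fit(self, X, Y):                             # Y is one-hot, shape (n, n_classes)
        self.W, self.B = [], []
        for D in self._design_matrices(X):
            self.B.append(self._ridge(D, Y))
        return self

    def predict_proba(self, X):
        scores = [D @ B for D, B in zip(self._design_matrices(X), self.B)]
        return np.mean(scores, axis=0)               # average the per-layer decisions


# Hypothetical usage: train_feats[i] / test_feats[i] hold pooled activations of the
# i-th residual block of a pretrained ResNet; one edRVFL is trained per block and
# their class scores are combined with a weighted average.
def ensemble_predict(train_feats, Y_train, test_feats, weights):
    models = [EDRVFL(seed=i).fit(F, Y_train) for i, F in enumerate(train_feats)]
    scores = [w * m.predict_proba(F) for m, F, w in zip(models, test_feats, weights)]
    return np.argmax(np.sum(scores, axis=0), axis=1)
```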
