Abstract
Representation learning impacts the performance of Machine Learning (ML) models. Feature extraction-based methods such as Auto-Encoders (AEs) are used to find new, more informative data representations from the original ones, enabling efficient performance on a given task in terms of (1) high accuracy, (2) large short-term memory and (3) low execution time. The Echo-State Network (ESN) is a specific kind of Recurrent Neural Network (RNN) that exhibits very rich dynamics on account of its reservoir-based hidden layer. It is widely used for complex non-linear problems and has been shown to outperform classical approaches on a number of benchmark tasks. In this paper, the rich dynamics and large memory provided by the ESN are integrated with the complementary strengths of AEs in feature extraction to develop a novel Echo-State Recurrent Autoencoder (ES-RA). In order to devise more robust alternatives to conventional reservoir-based networks, both single-layer (SL-ES-RA) and multi-layer (ML-ES-RA) models are formulated. The new features, extracted from the ESN's hidden layer, are applied to various benchmark ML tasks including classification, time series prediction and regression. A range of evaluation metrics improve considerably compared to those obtained with the original data features. An accuracy-based comparison is performed between the proposed recurrent AEs and two variants of Extreme Learning Machine (ELM) feed-forward AEs (single- and multi-layer), for both noise-free and noisy data. In summary, a comparative empirical study reveals the key contribution of exploiting recurrent connections in improving benchmark performance results.
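As an illustration only (not code from the paper), the following minimal NumPy sketch shows the generic echo-state reservoir update on which reservoir-based feature extraction of this kind relies: the sequence of reservoir states serves as the new, richer feature representation fed to a downstream readout. All names, dimensions and the spectral-radius value are assumptions.

```python
import numpy as np

def make_reservoir(n_inputs, n_reservoir, spectral_radius=0.9, seed=0):
    """Create random input and recurrent weights for a generic ESN reservoir."""
    rng = np.random.default_rng(seed)
    W_in = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_inputs))
    W = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_reservoir))
    # Rescale recurrent weights so the largest eigenvalue magnitude equals
    # spectral_radius, a common way to encourage the echo-state property.
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    return W_in, W

def reservoir_states(inputs, W_in, W):
    """Run an input sequence through the reservoir and return the hidden states.

    The collected states act as the extracted features that a downstream
    model (e.g. an autoencoder-style readout) can be trained on.
    """
    x = np.zeros(W.shape[0])
    states = []
    for u in inputs:                      # inputs: array of shape (T, n_inputs)
        x = np.tanh(W_in @ u + W @ x)     # standard ESN state update
        states.append(x.copy())
    return np.array(states)               # shape (T, n_reservoir)

# Usage: extract reservoir features for a toy 1-D time series.
if __name__ == "__main__":
    series = np.sin(np.linspace(0, 8 * np.pi, 200)).reshape(-1, 1)
    W_in, W = make_reservoir(n_inputs=1, n_reservoir=50)
    features = reservoir_states(series, W_in, W)
    print(features.shape)                 # (200, 50) new feature matrix
```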