Abstract

Cortical neural connectivity exhibits a small-world (SW) network topology, yet its role in neural information processing remains unclear. In this study, we investigate the learning performance of a Deep Echo State Network (DeepESN) that uses the Newman-Watts-Strogatz (NWS) graph topology as its reservoir. The short path lengths of the SW topology allow input information to reach the output nodes faster, and its clustered structure can expand the range of the echo state property. We also seek to identify the best combination of the key factors involved, namely the spectral radius of the reservoir's weight matrix and the NWS hyperparameters (the number of neighbors and the rewiring probability). We evaluate the NWS-DeepESN on four Speech Emotion Recognition (SER) tasks drawn from the RAVDESS, Emo-db, SAVEE, and TESS databases. By examining the NWS-DeepESN's performance on these SER tasks, we aim to determine the role of the NWS graph topology in enhancing neural information processing. The results show significant improvements in weighted and unweighted recognition accuracies of (76.45%, 75.79%), (87.89%, 87.14%), (99%, 98.62%), and (63%, 62.24%) on the RAVDESS, Emo-db, SAVEE, and TESS databases, respectively. The proposed approach improves emotion recognition capabilities and yields more accurate results than existing state-of-the-art approaches.
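
The abstract leaves the reservoir construction implicit. As a rough illustration only, the sketch below (not the authors' code) builds a single reservoir weight matrix from a Newman-Watts-Strogatz graph with networkx and rescales it to a chosen spectral radius; the function name, default values, and the single-layer simplification are assumptions, and a full DeepESN would stack several such reservoirs.

```python
# Minimal sketch, assuming an NWS-structured reservoir scaled to a target
# spectral radius (a common heuristic related to the echo state property).
import numpy as np
import networkx as nx

def nws_reservoir(n_units=500, k=6, p=0.1, spectral_radius=0.9, seed=0):
    """Hypothetical helper: random reservoir weights on an NWS graph."""
    rng = np.random.default_rng(seed)
    # NWS graph: ring lattice where each node connects to its k nearest
    # neighbors, plus shortcut edges added with probability p
    # (small world: short paths with high clustering).
    g = nx.newman_watts_strogatz_graph(n_units, k, p, seed=seed)
    mask = nx.to_numpy_array(g)                          # 0/1 adjacency matrix
    w = mask * rng.uniform(-1.0, 1.0, size=mask.shape)   # random weights on edges
    # Rescale so the largest eigenvalue magnitude equals the target radius.
    radius = np.max(np.abs(np.linalg.eigvals(w)))
    return w * (spectral_radius / radius)

W = nws_reservoir()
print(W.shape)  # (500, 500)
```

In this sketch, k and p play the roles of the "neighbors" and "probability" hyperparameters mentioned above, and spectral_radius corresponds to the scaling of the reservoir's weight matrix.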
