Abstract

In this paper, we derive an upper bound on the generalization error of reservoir computing (RC), a special category of recurrent neural networks (RNNs). The particular RC implementation considered here is the echo state network (ESN), and the generalization bound is derived via the empirical Rademacher complexity (ERC) approach. While recent work on risk bounds for RC frameworks relies on a non-standard ERC measure and a direct application of its definition, our work uses the standard ERC measure and tools that allow easier extension to deep RC structures and a fair comparison with other RNN structures, including vanilla RNNs. Finally, we train an ESN for the task of symbol detection, a key component of receive processing in 5G systems, and show with an example how the derived generalization bound can guide the underlying system design. Specifically, we combine the derived generalization error with the characterized training loss to analytically identify the optimum number of reservoir neurons for the symbol detection task. Simulation results using 3GPP-specified channels corroborate our theoretical findings, illustrating the significance and practical relevance of our work.
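To make the ESN setting concrete, the following is a minimal sketch of a standard echo state network: a fixed random reservoir (rescaled so the recurrent matrix has spectral radius below one, a common sufficient condition for the echo state property) with only a linear readout trained by ridge regression. All dimensions, scalings, and the toy one-step-memory task are illustrative assumptions, not the paper's symbol detection setup or its 3GPP channels.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions -- chosen for illustration only.
n_in, n_res = 1, 50

# Fixed random input and recurrent weights; rescale the recurrent
# matrix so its spectral radius is below 1 (echo state property).
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))


def run_reservoir(u):
    """Drive the reservoir with input sequence u (T x n_in); return states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ u_t + W @ x)
        states.append(x.copy())
    return np.array(states)


# Toy task (assumption): reproduce the previous input sample, which the
# reservoir state retains through its recurrent memory.
T = 500
u = np.sin(0.2 * np.arange(T)).reshape(-1, 1)
y = np.roll(u, 1, axis=0)

X = run_reservoir(u)
washout = 50  # discard the initial transient states
X_tr, y_tr = X[washout:], y[washout:]

# Linear readout via ridge regression -- the only trained part of an ESN.
lam = 1e-6
W_out = np.linalg.solve(X_tr.T @ X_tr + lam * np.eye(n_res), X_tr.T @ y_tr)
mse = np.mean((X_tr @ W_out - y_tr) ** 2)
```

Because only the readout is trained, the number of reservoir neurons `n_res` directly controls both expressiveness and the capacity term in a generalization bound, which is exactly the trade-off the paper's analysis exploits when selecting the reservoir size.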
