Abstract

In this paper, we derive a theoretical upper bound on the generalization error of reservoir computing (RC), a special category of recurrent neural networks (RNNs). The specific RC implementation considered in this paper is the echo state network (ESN), and an upper bound on its generalization error is derived via the empirical Rademacher complexity (ERC) approach. While recent work on risk bounds for RC frameworks relies on a non-standard ERC measure and a direct application of its definition, our work uses the standard ERC measure and tools, allowing a fair comparison with conventional RNNs. The derived result shows that the generalization error bound obtained for ESNs is tighter than the existing bound for vanilla RNNs, suggesting easier generalization for ESNs. Applying the ESN to symbol detection in MIMO-OFDM (Multiple Input Multiple Output-Orthogonal Frequency Division Multiplexing) systems, we show how the derived generalization error bound can guide the underlying system design. Specifically, the derived bound, together with the empirically characterized training loss, is utilized to identify the optimal reservoir size, in number of neurons, for the ESN-based symbol detector. Finally, we corroborate our theoretical findings with results from simulations that employ 3GPP standards-compliant wireless channels, signifying the practical relevance of our work.
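To make the ESN setting concrete, the following is a minimal sketch of an echo state network in NumPy. It is not the paper's implementation: the reservoir size, spectral radius, input scaling, ridge regularizer, and the toy one-step-delay target are all illustrative assumptions. What it does show is the structural property the abstract leans on: the recurrent weights are fixed at random (scaled for the echo state property), and only the linear readout is trained, which is why the ESN hypothesis class is smaller than that of a fully trained RNN.

```python
import numpy as np

rng = np.random.default_rng(0)
N, d_in, d_out = 100, 2, 2            # reservoir size, input dim, output dim (illustrative)

# Fixed random input and recurrent weights; never trained in an ESN.
W_in = rng.uniform(-0.5, 0.5, (N, d_in))
W = rng.uniform(-0.5, 0.5, (N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # spectral radius < 1 (echo state property)

def run_reservoir(U):
    """Drive the fixed reservoir with an input sequence U of shape (T, d_in)."""
    x = np.zeros(N)
    states = []
    for u in U:
        x = np.tanh(W_in @ u + W @ x)        # untrained recurrent update
        states.append(x)
    return np.array(states)                  # (T, N) collected states

# Toy task (assumption): recover the one-step-delayed input from reservoir states.
U = rng.standard_normal((200, d_in))
Y = np.roll(U, 1, axis=0)

# Only the linear readout is trained, via ridge regression on reservoir states.
X = run_reservoir(U)
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ Y)
Y_hat = X @ W_out
```

Because training touches only `W_out`, sweeping the reservoir size N and re-fitting the readout is cheap, which is exactly the kind of model-selection loop the derived bound, combined with the empirical training loss, is meant to guide.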
