Abstract

In this paper, we derive a theoretical upper bound on the generalization error of reservoir computing (RC), a special category of recurrent neural networks (RNNs). The specific RC implementation considered in this paper is the echo state network (ESN), and an upper bound on its generalization error is derived via the empirical Rademacher complexity (ERC) approach. While recent work on risk bounds for RC frameworks relies on a non-standard ERC measure and a direct application of its definition, our work uses the standard ERC measure and tools, allowing a fair comparison with conventional RNNs. The derived result shows that the generalization error bound obtained for ESNs is tighter than the existing bound for vanilla RNNs, suggesting easier generalization for ESNs. With the ESN applied to symbol detection in multiple-input multiple-output orthogonal frequency-division multiplexing (MIMO-OFDM) systems, we show how the derived generalization error bound can guide the underlying system design. Specifically, the derived bound, together with the empirically characterized training loss, is utilized to identify the optimum reservoir size (number of neurons) for the ESN-based symbol detector. Finally, we corroborate our theoretical findings with results from simulations that employ 3GPP standards-compliant wireless channels, signifying the practical relevance of our work.
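To make the ESN setting concrete, the following is a minimal illustrative sketch (not the paper's implementation) of why ESNs are attractive: the input and recurrent reservoir weights are drawn at random and kept fixed, and only the linear readout is trained, so training reduces to a least-squares problem. All sizes, scalings, and the toy regression target below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions; the paper optimizes the reservoir size.
n_in, n_res = 2, 50

# Fixed random input and reservoir weights (never trained).
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
# Scale the spectral radius below 1, a common heuristic aimed at
# the echo state property (fading memory of initial conditions).
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(U):
    """Drive the reservoir with input sequence U (T x n_in); return states (T x n_res)."""
    x = np.zeros(n_res)
    states = []
    for u in U:
        x = np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy supervised task: recover a noisy linear function of the input.
U = rng.standard_normal((200, n_in))
Y = U[:, :1] + 0.1 * rng.standard_normal((200, 1))

# Only the readout W_out is trained, here by ridge regression on the states.
X = run_reservoir(U)
lam = 1e-6
W_out = np.linalg.solve(X.T @ X + lam * np.eye(n_res), X.T @ Y)
Y_hat = X @ W_out
print(Y_hat.shape)  # (200, 1)
```

Because the trainable part is a single linear map, the effective hypothesis class is much smaller than that of a fully trained RNN, which is the intuition behind the tighter ERC-based bound derived in the paper.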
