In this study, we develop a recurrent neural network-induced Gaussian process (RNNGP) to model sequence data. We derive the equivalence between infinitely wide neural networks and Gaussian processes (GPs) for a relaxed recurrent neural network (RNN) with untied weights. We compute the covariance function of the RNNGP with an analytical iteration formula that follows the RNN recurrence and uses an error-function activation. For simplicity, we use the RNNGP to perform Bayesian inference for vanilla RNNs on several problems: Modified National Institute of Standards and Technology (MNIST) digit identification, Mackey–Glass time-series forecasting, and lithium-ion battery state-of-health estimation. The results demonstrate the flexibility of the RNNGP in modeling sequence data. Furthermore, the RNNGP predictions typically outperform those of the original RNNs and of standard GPs, indicating the efficiency of the RNNGP as a data-driven model. Moreover, the RNNGP quantifies the uncertainty in its predictions, highlighting its potential for uncertainty quantification analyses.
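As a brief illustration of the kind of analytical iteration formula referred to above (a sketch, not the paper's exact derivation), the infinite-width covariance recursion for an error-function activation admits a closed form of the type
\[
K^{(t+1)}(x, x') \;=\; \sigma_b^2 \;+\; \sigma_w^2 \, \frac{2}{\pi} \arcsin\!\left( \frac{2\, K^{(t)}(x, x')}{\sqrt{\bigl(1 + 2 K^{(t)}(x, x)\bigr)\bigl(1 + 2 K^{(t)}(x', x')\bigr)}} \right),
\]
where \(\sigma_w^2\) and \(\sigma_b^2\) are assumed weight and bias variances and \(K^{(t)}\) denotes the covariance after \(t\) recurrent steps; the RNNGP recursion additionally incorporates the sequence input at each step, so the formula used in the paper may differ in its exact form.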