In this paper, we consider several sequence-resampling schemes in reservoir computing models for nonlinear time series prediction. These schemes can enrich the features available for batch training of the readout, leading to better prediction performance. To implement them, we first introduce a modular approach that constructs multi-reservoir echo state network (ESN) models by assembling encoding and decoding modules. The encoding module consists of a resampling unit, a group-wise reservoir unit, and a collection unit, which together extract diverse features from a sequence. The decoding module is a linear regressor trained to produce the desired outputs. We then propose three novel multi-reservoir ESN models: DeepESN with Every-layer Sequence Resampling (DeepESN-ESR), DeepESN with Last-layer Sequence Resampling (DeepESN-LSR), and GroupedESN with Input-layer Sequence Resampling (GroupedESN-ISR). These models demonstrate how sequence resampling can be applied to multi-reservoir ESN architectures. Numerical results on five challenging nonlinear time series prediction tasks show that the proposed models outperform several state-of-the-art multi-reservoir ESN models. An evaluation of computational time shows that, in practice, the three proposed models require less training cost than many existing multi-reservoir ESN models. Moreover, a comprehensive comparative analysis reveals that the proposed models retain temporal information over longer spans and generate richer dynamics in the reservoir states than some existing models. The proposed schemes for extracting diverse features hidden in a sequence of reservoir states can be leveraged broadly in other reservoir computing systems to improve their performance in nonlinear time series prediction.
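To make the encode–resample–readout pipeline concrete, the following is a minimal single-reservoir sketch in NumPy. It is not the paper's DeepESN-ESR/LSR or GroupedESN-ISR architecture; the stride-based sample-and-hold subsampling in `resample_states`, the toy sine-prediction task, and all parameter values are illustrative assumptions standing in for the resampling and collection units, with a batch-trained ridge-regression readout playing the role of the decoding module.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n_in, n_res, spectral_radius=0.9, input_scale=0.5):
    """Random reservoir weights, rescaled to a target spectral radius."""
    W = rng.standard_normal((n_res, n_res))
    W *= spectral_radius / abs(np.linalg.eigvals(W)).max()
    W_in = input_scale * rng.uniform(-1.0, 1.0, (n_res, n_in))
    return W_in, W

def run_reservoir(W_in, W, inputs):
    """Drive the reservoir with an input sequence; return all states."""
    x = np.zeros(W.shape[0])
    states = []
    for u in inputs:
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x)
    return np.array(states)

def resample_states(states, strides=(1, 2, 4)):
    """Hypothetical resampling unit: subsample the state sequence at
    several strides (holding each sampled state until the next sample)
    and concatenate, enriching the feature vector at every time step."""
    feats = []
    for s in strides:
        idx = (np.arange(len(states)) // s) * s  # sample-and-hold indices
        feats.append(states[idx])
    return np.concatenate(feats, axis=1)

# Toy task: one-step-ahead prediction of a noisy sine wave.
t = np.linspace(0, 40 * np.pi, 2000)
series = np.sin(t) + 0.05 * rng.standard_normal(t.size)
inputs, targets = series[:-1], series[1:]

W_in, W = make_reservoir(n_in=1, n_res=100)
X = resample_states(run_reservoir(W_in, W, inputs))

# Batch-trained linear (ridge) readout, discarding a transient washout.
washout, lam = 100, 1e-6
Xw, yw = X[washout:], targets[washout:]
W_out = np.linalg.solve(Xw.T @ Xw + lam * np.eye(Xw.shape[1]), Xw.T @ yw)
nrmse = np.sqrt(np.mean((Xw @ W_out - yw) ** 2)) / np.std(yw)
```

The resampled feature matrix here has three stride views concatenated per step (300 columns for 100 reservoir units), so the readout sees the sequence at several time scales at once; a multi-reservoir variant would apply such a unit at the input, at every layer, or at the last layer, as the three proposed models do.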