We propose a recursive reservoir concatenation architecture in reservoir computing for salt-and-pepper noise removal. The recursive algorithm consists of two components. The first is the initial network training for the recursion: since standard reservoir computing does not accept images directly as input data, we designed a nonlinear, image-specific forward operator that extracts image features from noisy input images, which are then mapped into a reservoir for training. The second is the recursive reservoir concatenation, which further improves reconstruction quality: owing to the hierarchical structure of the concatenation, training errors decrease as more reservoirs are added. The proposed method outperformed most analytic and machine-learning-based denoising models for salt-and-pepper noise, at a training cost much lower than that of other neural-network-based models. Reconstruction is fully parallelizable, since noise in different pixels can be removed independently.
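The abstract describes the pipeline only at a high level. As a rough illustration of the general flavor (not the authors' exact construction), the minimal sketch below substitutes a random tanh expansion of local pixel patches for the image-specific forward operator, uses a ridge-regression readout per stage, and stacks stages as a stand-in for the recursive reservoir concatenation. All function names, parameters, and design choices here (`extract_patch_features`, `n_res`, the patch size, etc.) are hypothetical assumptions, not taken from the paper.

```python
import numpy as np

def extract_patch_features(img, k=1):
    """Hypothetical forward operator: collect the (2k+1)x(2k+1) neighbourhood
    of every pixel as one feature row (edges are mirror-padded)."""
    padded = np.pad(img, k, mode="reflect")
    h, w = img.shape
    feats = [
        padded[i:i + h, j:j + w].ravel()
        for i in range(2 * k + 1)
        for j in range(2 * k + 1)
    ]
    return np.stack(feats, axis=1)          # shape (h*w, (2k+1)**2)

def reservoir_expand(feats, n_res=200, seed=0):
    """One reservoir stage modeled as a fixed random nonlinear expansion."""
    rng = np.random.default_rng(seed)
    W_in = rng.uniform(-1.0, 1.0, size=(feats.shape[1], n_res))
    return np.tanh(feats @ W_in)

def train_readout(states, targets, ridge=1e-4):
    """Ridge-regression readout from reservoir states to clean pixel values."""
    A = states.T @ states + ridge * np.eye(states.shape[1])
    return np.linalg.solve(A, states.T @ targets)

def recursive_denoise(noisy, clean, n_stages=3, n_res=200):
    """Train a cascade of reservoir stages; each stage refines the output of
    the previous one, so the training error typically shrinks stage by stage."""
    current = noisy.astype(float)
    readouts = []
    for s in range(n_stages):
        feats = extract_patch_features(current)
        states = reservoir_expand(feats, n_res=n_res, seed=s)
        w = train_readout(states, clean.ravel())
        current = (states @ w).reshape(noisy.shape)
        readouts.append(w)
        print(f"stage {s}: training MSE = {np.mean((current - clean) ** 2):.4e}")
    return current, readouts

# Toy usage: a synthetic image corrupted by salt-and-pepper noise.
rng = np.random.default_rng(1)
clean = np.linspace(0.0, 1.0, 64 * 64).reshape(64, 64)
noisy = clean.copy()
mask = rng.random(clean.shape) < 0.1
noisy[mask] = rng.choice([0.0, 1.0], size=mask.sum())
denoised, _ = recursive_denoise(noisy, clean)
```

Because each pixel is denoised from its own patch features through the shared readout, the per-pixel reconstruction in this sketch is embarrassingly parallel, which mirrors the parallelism claim in the abstract.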