In this paper, we extend the original work on multiresolution learning for neural networks and present new developments of the multiresolution learning paradigm. The contributions of this paper are: (1) proposing a new concept and method of adjustable neural activation functions in multiresolution learning to improve neural network learning efficacy and generalization performance for signal prediction; (2) providing new, insightful explanations of the multiresolution learning paradigm from a multiresolution optimization perspective; (3) exploring the underlying reasons why adjustable activation functions are better suited to the multiresolution learning paradigm than fixed ones; and (4) rigorously validating the multiresolution learning paradigm with adjustable activation functions and comparing it against multiresolution learning with fixed activation functions and against traditional learning. The paper systematically presents new analytical and experimental results on the multiresolution learning approach for training an individual neural network model, demonstrates our integrated approach to improving neural network learning efficacy, and illustrates the significant improvements in generalization performance and robustness it yields for nonlinear signal prediction.
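To make the notion of an adjustable activation function concrete, the following is a minimal sketch, assuming the adjustable activation is a sigmoid whose slope (gain) is a trainable parameter updated alongside the network weights. The use of PyTorch, the class name AdjustableSigmoid, and the gain parameterization are illustrative assumptions on our part; the abstract does not specify the exact form used in the paper.

```python
import torch
import torch.nn as nn


class AdjustableSigmoid(nn.Module):
    """Hypothetical sigmoid activation with a learnable gain (slope) parameter."""

    def __init__(self, init_gain: float = 1.0):
        super().__init__()
        # The gain is trained together with the network weights, so the
        # activation shape can adapt across learning stages.
        self.gain = nn.Parameter(torch.tensor(init_gain))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(self.gain * x)


# A small feed-forward predictor that could be trained stage by stage on
# coarse-to-fine (multiresolution) representations of a signal.
model = nn.Sequential(
    nn.Linear(8, 16),
    AdjustableSigmoid(),
    nn.Linear(16, 1),
)
```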