It is well documented that cross-layer connections in feedforward small-world neural networks (FSWNNs) facilitate the efficient transmission of gradients, thereby improving generalization ability and accelerating learning. However, the merits of long-distance cross-layer connections are not fully exploited because the rewiring is random. In this study, aiming to further improve learning efficiency, a fast FSWNN (FFSWNN) is proposed that takes advantage of the positive effects of long-distance cross-layer connections, and it is applied to nonlinear system modeling. First, a novel rewiring rule that gives priority to long-distance cross-layer connections is proposed to increase gradient transmission efficiency when constructing the FFSWNN. Second, an improved ridge regression method is put forward to determine initial weights that keep the sigmoidal neurons in the FFSWNN highly activated. Finally, to further improve learning efficiency, an asynchronous learning algorithm is designed to train the FFSWNN: the weights connected to the output layer are updated by ridge regression, while the remaining weights are updated by gradient descent. Experiments are conducted on four benchmark datasets from the University of California, Irvine (UCI) machine learning repository and on two real-world datasets to evaluate the performance of the FFSWNN in nonlinear system modeling. The results show that the FFSWNN converges significantly faster and achieves higher modeling accuracy than the comparison models, and they demonstrate the positive effects of the novel rewiring rule, the improved weight initialization, and the asynchronous learning algorithm on learning efficiency.
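To make the asynchronous update scheme concrete, the following is a minimal sketch of the core idea: output-layer weights solved in closed form by ridge regression, and the remaining weights updated by gradient descent. It assumes a plain single-hidden-layer sigmoid network with hypothetical hyperparameters (learning rate, regularization strength); the small-world cross-layer topology, the priority rewiring rule, and the paper's exact update formulas are not reproduced here.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ridge_output_weights(H, Y, lam=1e-3):
    """Closed-form ridge regression: W_out = (H^T H + lam*I)^{-1} H^T Y."""
    return np.linalg.solve(H.T @ H + lam * np.eye(H.shape[1]), H.T @ Y)

def train_async(X, Y, n_hidden=20, epochs=200, lr=0.05, lam=1e-3, seed=0):
    """Asynchronous training sketch: ridge regression for output weights,
    gradient descent for the hidden-layer weights (illustrative only)."""
    rng = np.random.default_rng(seed)
    W_in = rng.normal(scale=0.5, size=(X.shape[1], n_hidden))  # hidden weights
    b_in = np.zeros(n_hidden)
    for _ in range(epochs):
        H = sigmoid(X @ W_in + b_in)             # hidden activations
        W_out = ridge_output_weights(H, Y, lam)  # output weights in closed form
        E = H @ W_out - Y                        # residual error
        # One gradient-descent step on the hidden weights, with the
        # output weights held fixed at their ridge solution.
        dH = (E @ W_out.T) * H * (1.0 - H)
        W_in -= lr * X.T @ dH / len(X)
        b_in -= lr * dH.mean(axis=0)
    return W_in, b_in, W_out

if __name__ == "__main__":
    # Toy nonlinear regression problem for illustration.
    X = np.random.default_rng(1).uniform(-1, 1, size=(200, 2))
    Y = np.sin(np.pi * X[:, :1]) * X[:, 1:]
    W_in, b_in, W_out = train_async(X, Y)
    pred = sigmoid(X @ W_in + b_in) @ W_out
    print("training RMSE:", np.sqrt(np.mean((pred - Y) ** 2)))
```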