Abstract

To address the relatively large architecture of the small-world neural network and to improve its generalization ability, we propose a pruning feedforward small-world neural network based on a dynamic regularization method with the smoothing l1/2 norm (PFSWNN-DSRL1/2) and apply it to nonlinear system modeling. A feedforward small-world neural network is first constructed by the Watts–Strogatz rewiring rule. By minimizing a modified error function augmented with a smoothing l1/2 norm, redundant weights are pruned to yield a sparse architecture. A dynamic adjustment strategy for the regularization strength is further designed to balance the tradeoff between training accuracy and sparsity. Several experiments are carried out to evaluate the performance of the proposed PFSWNN-DSRL1/2 on nonlinear system modeling. The results show that PFSWNN-DSRL1/2 achieves satisfactory modeling accuracy while pruning an average of 17% of the weights. Comparative results demonstrate that the generalization performance of the proposed model improves on the baseline method (FSWNN) by 8.1% despite the sparser structure, and that pruning does not degrade the network's small-world property.
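For intuition, the sketch below illustrates how a smoothing-l1/2 penalty, a magnitude-based pruning step, and a dynamic regularization strength might fit together. The piecewise-polynomial smoothing function is a common choice in the smoothing-l1/2 literature, and the threshold and lambda-update rule are illustrative assumptions, not the paper's exact formulation.

import numpy as np

def smooth_abs(w, a=0.05):
    # C1-smooth surrogate for |w| near zero (assumed piecewise-polynomial
    # form); it equals |w| exactly for |w| >= a and stays positive at w = 0.
    inner = -w**4 / (8 * a**3) + 3 * w**2 / (4 * a) + 3 * a / 8
    return np.where(np.abs(w) >= a, np.abs(w), inner)

def modified_error(train_error, weights, lam):
    # Modified error function: training error plus lam times the
    # smoothing-l1/2 penalty, sum over weights of smooth_abs(w)^(1/2).
    return train_error + lam * np.sum(smooth_abs(weights) ** 0.5)

def prune(weights, threshold=1e-3):
    # Zero out weights the penalty has driven close to zero,
    # producing the sparse architecture.
    return np.where(np.abs(weights) < threshold, 0.0, weights)

def update_lambda(lam, train_error, target_error):
    # Hypothetical dynamic adjustment: push harder for sparsity while the
    # network still meets its accuracy target, back off otherwise.
    return lam * 1.05 if train_error < target_error else lam * 0.95

In a training loop, one would minimize modified_error by gradient descent, call update_lambda once per epoch, and apply prune after training converges.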
