Abstract

To enhance network sparsity, improve generalization ability, and accelerate training, we propose a novel pruning approach for sigma-pi-sigma neural networks (SPSNNs) under relaxed conditions, based on a smoothing group L1/2 regularization term and adaptive momentum. The main strength of this method is that it can prune both the redundant nodes between groups in the network and the redundant weights of the non-redundant nodes within a group, thereby achieving network sparsity. Another strength is that the non-smooth absolute value function in the traditional L1/2 regularization method is replaced by a smooth function. This reduces oscillation during learning and allows the convergence of the proposed algorithm to be proved more effectively. Finally, numerical simulation results demonstrate the effectiveness of the proposed algorithm.
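To make the idea concrete, the following is a minimal sketch of what such a smoothing group L1/2 regularizer can look like. The grouping (all weights feeding node g), the L1-type intra-group sum, the smoothing width a > 0, and the quartic smoothing polynomial are assumptions drawn from the common form of this technique in the literature; the abstract does not specify the paper's exact penalty, which may differ.

\[
E(\mathbf{w}) \;=\; \widehat{E}(\mathbf{w})
\;+\; \lambda \sum_{g=1}^{G} \Bigl( \sum_{i \in g} f(w_{i}) \Bigr)^{1/2},
\qquad
f(t) \;=\;
\begin{cases}
  |t|, & |t| \ge a,\\[4pt]
  -\dfrac{t^{4}}{8a^{3}} + \dfrac{3t^{2}}{4a} + \dfrac{3a}{8}, & |t| < a,
\end{cases}
\]

where \(\widehat{E}\) is the usual training error and \(\lambda > 0\) is the regularization coefficient. In this sketch, the outer square root drives entire groups (nodes) toward zero, while the inner sum over \(|w_i|\)-like terms promotes sparsity among the weights of surviving nodes, matching the two pruning effects described above. The piecewise polynomial agrees with \(|t|\) in value and first derivative at \(|t| = a\), and since \(f(t) \ge 3a/8 > 0\), each group term stays away from the non-differentiable point of the square root at zero, which is what removes the oscillation and makes a convergence analysis tractable.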
