Abstract

Neural networks with random weights have the advantage of fast computation in both training and testing. However, one of the main challenges for single-layer feedforward neural networks is selecting the optimal number of neurons in the hidden layer, since too few neurons lead to underfitting and too many to overfitting. Adapting Garson's algorithm, this paper introduces a new, efficient, and fast non-iterative algorithm for selecting the number of hidden neurons in randomization-based neural networks. The proposed approach consists of three steps: (1) train the network with h hidden neurons, (2) apply Garson's algorithm to the weight matrix of the hidden layer, and (3) prune the hidden layer, removing neurons based on the harmonic mean. Our experiments on regression and classification problems confirm that combining the pruning technique with these types of neural networks improves their predictive performance in terms of mean squared error and accuracy. Additionally, we tested the proposed pruning method on neural networks trained with sequential learning algorithms, where the Random Vector Functional Link network generally obtained the best predictive performance compared to online sequential versions of extreme learning machines and single-hidden-layer neural networks with random weights.
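The sketch below illustrates the three-step procedure described above under stated assumptions, since the abstract does not give implementation details: an ELM/RVFL-style network with random input-to-hidden weights and least-squares output weights, a Garson-style relevance score per hidden neuron computed from absolute weight products, and a harmonic-mean threshold for pruning. The function names (`train_random_network`, `garson_neuron_importance`, `prune_by_harmonic_mean`) and the exact way Garson's algorithm is applied to the hidden-layer matrix are assumptions, not the authors' reference implementation.

```python
import numpy as np


def train_random_network(X, y, h, seed=0):
    """Step 1 (assumed ELM/RVFL style): fix random input-to-hidden weights
    and solve the output weights in closed form by least squares."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], h))  # random input-to-hidden weights
    b = rng.uniform(-1.0, 1.0, size=h)                # random hidden biases
    H = np.tanh(X @ W + b)                            # hidden-layer activations
    beta = np.linalg.pinv(H) @ y                      # output weights (Moore-Penrose solution)
    return W, b, beta


def garson_neuron_importance(W, beta):
    """Step 2 (assumption): Garson-style relevance per hidden neuron, built
    from absolute products of input-to-hidden and hidden-to-output weights,
    normalised per input variable and averaged over inputs."""
    h = W.shape[1]
    out_w = np.abs(beta).reshape(h, -1).sum(axis=1)       # |hidden->output| weight per neuron
    contrib = np.abs(W) * out_w                           # (inputs x neurons) contributions
    contrib = contrib / contrib.sum(axis=1, keepdims=True)  # normalise per input variable
    return contrib.mean(axis=0)                           # mean relevance per hidden neuron


def prune_by_harmonic_mean(importance):
    """Step 3 (assumption): keep neurons whose relevance reaches at least the
    harmonic mean of all relevance scores."""
    hmean = len(importance) / np.sum(1.0 / importance)
    return np.where(importance >= hmean)[0]


# Hypothetical usage with some dataset (X, y):
# W, b, beta = train_random_network(X, y, h=100)
# imp = garson_neuron_importance(W, beta)
# keep = prune_by_harmonic_mean(imp)
# H_pruned = np.tanh(X @ W[:, keep] + b[keep])
# beta_pruned = np.linalg.pinv(H_pruned) @ y  # re-estimate output weights after pruning
```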
