Abstract

Stochastic configuration networks (SCNs) are a class of randomized learner models with a guaranteed universal approximation property, in which random weights and biases are drawn from a uniform distribution and selected by a supervisory mechanism. This paper investigates the impact of the distribution of the random weights on the performance of SCNs. In light of a fundamental principle in machine learning, namely that a model with smaller parameter magnitudes tends to generalize better, we recommend using symmetric zero-centered distributions when constructing SCNs to improve generalization performance. Furthermore, we introduce a scale factor into the distributions so that the SCN model adapts to different datasets. Simulation results are reported for both regression and classification tasks over twenty-one benchmark datasets using SCN. Results are also presented on ten regression datasets using a deep implementation of SCN, known as deep stochastic configuration networks (DeepSCN).
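
The following is a minimal illustrative sketch (not the authors' code) of the construction the abstract refers to: hidden nodes are added one at a time, candidate weights and biases are drawn from a symmetric zero-centered uniform distribution scaled by a user-chosen factor, and only candidates passing an inequality-type supervisory condition are accepted. The function name `scn_fit` and parameters such as `scale`, `candidates`, and `r` are assumptions for illustration, not names from the paper.

```python
import numpy as np

def scn_fit(X, y, max_nodes=50, candidates=100, scale=1.0, r=0.99, tol=1e-3, seed=0):
    """Incrementally build a single-hidden-layer SCN-style regressor (sketch).

    Random weights/biases are drawn from a symmetric zero-centered uniform
    distribution [-scale, scale]; each candidate node must satisfy a
    simplified supervisory (inequality) condition before it is accepted.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    H = np.empty((n, 0))          # hidden-layer output matrix
    residual = y.copy()           # current approximation residual
    W, b = [], []
    beta = np.zeros(0)

    for _ in range(max_nodes):
        best_xi, best = -np.inf, None
        for _ in range(candidates):
            w_c = rng.uniform(-scale, scale, size=d)   # zero-centered, scaled
            b_c = rng.uniform(-scale, scale)
            h = np.tanh(X @ w_c + b_c)                 # candidate node output
            hh = h @ h
            if hh == 0.0:
                continue
            # supervisory condition (simplified): the candidate must capture
            # enough of the residual, i.e. <e, h>^2 / <h, h> >= (1 - r) <e, e>
            xi = (residual @ h) ** 2 / hh - (1 - r) * (residual @ residual)
            if xi > best_xi:
                best_xi, best = xi, (w_c, b_c, h)
        if best is None or best_xi < 0:
            break                                      # no acceptable candidate
        w_c, b_c, h = best
        W.append(w_c); b.append(b_c)
        H = np.column_stack([H, h])
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)   # refresh output weights
        residual = y - H @ beta
        if np.linalg.norm(residual) < tol:
            break
    return np.array(W), np.array(b), beta

# toy usage: a larger `scale` widens the sampling interval for a given dataset
X = np.random.rand(200, 1)
y = np.sin(2 * np.pi * X[:, 0])
W, b, beta = scn_fit(X, y, scale=2.0)
print("hidden nodes added:", len(b))
```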
