Abstract

To overcome the pitfalls of the Random Vector Functional Link (RVFL) network, a network called Stochastic Configuration Networks (SCN) has been proposed. By constraining and adaptively selecting the range of the randomized parameters using the Stochastic Configuration (SC) algorithm, SCN is claimed to be effective for building an incremental randomized learning system based on residual error minimization. The SC algorithm has three variants, depending on how the output weights are updated. In this work, we first relate SCN to the relevant literature. Subsequently, we show that the major parts of the SC algorithm can be replaced by a generic hyper-parameter optimization method to obtain overall better results.
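The incremental scheme the abstract refers to can be sketched as follows. This is a minimal illustration, not the authors' code: `scn_fit`, its parameter names, and the candidate-search loop are hypothetical, and it assumes an SC-III-style global least-squares refit of the output weights after each accepted node. The key idea is that a random candidate node is admitted only if it satisfies the stochastic configuration inequality, which guarantees the residual error keeps decreasing.

```python
import numpy as np

def scn_fit(X, y, max_nodes=25, candidates=100, r=0.99,
            scales=(0.5, 1.0, 5.0, 10.0), rng=None):
    """Sketch of SCN-style incremental training (illustrative, not the
    authors' implementation). Candidate random nodes are accepted only
    when they satisfy the stochastic configuration inequality."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    H = np.empty((n, 0))        # outputs of accepted hidden nodes
    beta = np.zeros(0)          # output weights
    e = y.copy()                # current residual
    for _ in range(max_nodes):
        best_h, best_xi = None, -np.inf
        for s in scales:        # adaptively widen the random-parameter range
            for _ in range(candidates):
                w = rng.uniform(-s, s, d)
                b = rng.uniform(-s, s)
                h = np.tanh(X @ w + b)
                # SC inequality: xi > 0 means this node provably reduces
                # the residual; (1 - r) controls how strict the test is.
                xi = (e @ h) ** 2 / (h @ h) - (1 - r) * (e @ e)
                if xi > best_xi:
                    best_xi, best_h = xi, h
        if best_xi <= 0:        # no admissible candidate in any range: stop
            break
        H = np.column_stack([H, best_h])
        # global least-squares refit of all output weights (SC-III style)
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)
        e = y - H @ beta
    return H, beta, e

# toy regression: fit y = sin(x) on [-3, 3]
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X).ravel()
H, beta, e = scn_fit(X, y, rng=0)
```

It is exactly this candidate search over hand-picked scale ranges that the paper proposes to replace with a generic hyper-parameter optimization method.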
