Abstract

The stochastic configuration network (SCN) is an incremental learning approach with the universal approximation property for analyzing high-dimensional and large-scale data. It can start from a small structure and gradually add hidden nodes under a supervisory mechanism. The broad learning system (BLS), a novel architecture, has been widely used in various fields because of its high efficiency, generalization performance, and scalability. Unlike deep learning models, BLS is built on the random vector functional link network (RVFLN): it maps the input features to a more appropriate feature space, and the mapped features are then enhanced as enhancement nodes. Therefore, to strengthen the feature learning ability and improve the effectiveness of SCN, and inspired by the BLS architecture, a novel broad stochastic configuration network (BSCN) is proposed, together with its structure and theory. In this model, the original features are transformed into mapping features in the feature layer, and these mapping features are enhanced in the enhancement layer; the input weights and biases of the enhancement nodes are determined by the supervisory mechanism, and the output weight matrix is computed by standard least squares. Experimental results on function approximation problems and real-world datasets indicate that BSCN improves the performance of SCN, achieving higher regression accuracy and stability.
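The pipeline sketched in the abstract (random feature layer, supervised enhancement layer, least-squares readout) can be illustrated in a minimal form. The sketch below is an illustrative assumption, not the paper's implementation: the function `bscn_fit`, all node counts, and the candidate-scoring rule are hypothetical, and the SCN inequality-based supervisory mechanism is replaced by a simple residual-correlation proxy ⟨e, h⟩²/⟨h, h⟩ for picking each enhancement node's random weights.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bscn_fit(X, Y, n_map=20, n_enh=30, n_candidates=50, seed=0):
    """Illustrative BSCN-style sketch (hypothetical helper, not the paper's code):
    1. feature layer: a random mapping of the inputs,
    2. enhancement layer: nodes added one at a time, each chosen from a pool of
       random candidates by a residual-correlation proxy standing in for the
       SCN supervisory mechanism,
    3. output weights solved by ordinary least squares.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # feature (mapping) layer: random weights and biases
    Wf = rng.uniform(-1, 1, (d, n_map))
    bf = rng.uniform(-1, 1, n_map)
    Z = sigmoid(X @ Wf + bf)
    # enhancement layer: grow the hidden representation incrementally
    H = Z.copy()
    for _ in range(n_enh):
        beta, *_ = np.linalg.lstsq(H, Y, rcond=None)
        E = Y - H @ beta  # current residual
        best_h, best_score = None, -np.inf
        for _ in range(n_candidates):
            w = rng.uniform(-1, 1, Z.shape[1])
            b = rng.uniform(-1, 1)
            h = sigmoid(Z @ w + b)
            # proxy for the supervisory criterion: <e, h>^2 / <h, h>
            score = (h @ E) ** 2 / (h @ h)
            if score > best_score:
                best_score, best_h = score, h
        H = np.column_stack([H, best_h])
    # final output weights by standard least squares
    beta, *_ = np.linalg.lstsq(H, Y, rcond=None)
    return H, beta
```

Because the readout is linear, adding enhancement columns can never increase the training residual of the least-squares fit, which is why the broad expansion helps regression accuracy on top of the random feature layer.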
