Abstract
Broad learning system (BLS), an efficient neural network with a flat structure, has received a lot of attention due to its advantages in training speed and network extensibility. However, the conventional BLS adopts the least-squares loss, which treats every sample equally and is therefore sensitive to noise and outliers. To address this concern, in this article we propose a self-paced BLS (SPBLS) model that incorporates a self-paced learning (SPL) strategy into the network for noisy data regression. Under the SPL criterion, the model output is used as feedback to learn an appropriate priority weight that readjusts the importance of each sample. This reweighting strategy helps SPBLS distinguish samples from "easy" to "difficult" during training, making the model robust to noise and outliers while preserving the characteristics of the original system. Moreover, two incremental learning algorithms associated with SPBLS are developed, so that the system can be updated quickly and flexibly without retraining the entire model when new training samples are added or the network needs to be expanded. Experiments conducted on various datasets demonstrate that the proposed SPBLS achieves satisfactory performance for noisy data regression.
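To make the reweighting idea concrete, the following is a minimal sketch of self-paced reweighted regression on BLS-style random features. It is an illustration under assumptions, not the paper's exact formulation: the hard SPL rule (keep a sample only if its current loss is below a threshold that grows each round), the node counts, and the ridge-regularized output solver are all hypothetical choices made for this example.

import numpy as np

def bls_features(X, n_feature=20, n_enhance=40):
    """Map inputs to concatenated feature and enhancement nodes (flat BLS-style layer)."""
    rng = np.random.default_rng(0)
    Wf = rng.standard_normal((X.shape[1], n_feature))
    Z = X @ Wf                                   # feature nodes (linear mapping here)
    We = rng.standard_normal((n_feature, n_enhance))
    H = np.tanh(Z @ We)                          # enhancement nodes
    return np.hstack([Z, H])

def self_paced_fit(X, y, lam=1.0, growth=1.3, n_iter=10, ridge=1e-2):
    """Alternate between weighted ridge regression and updating per-sample priority weights."""
    A = bls_features(X)
    v = np.ones(len(y))                          # priority weights, start with all samples
    for _ in range(n_iter):
        Av = A * v[:, None]                      # weighted design matrix
        W = np.linalg.solve(A.T @ Av + ridge * np.eye(A.shape[1]), Av.T @ y)
        loss = (A @ W - y) ** 2                  # per-sample squared error as feedback
        v = (loss < lam).astype(float)           # hard SPL rule: keep only "easy" samples
        lam *= growth                            # gradually admit harder samples
    return W

# Usage: noisy linear data with a few gross outliers (synthetic, for illustration only)
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + 0.05 * rng.standard_normal(200)
y[:10] += 5.0                                    # inject outliers
W = self_paced_fit(X, y)
print("RMSE on clean portion:", np.sqrt(np.mean((bls_features(X)[10:] @ W - y[10:]) ** 2)))

Because the priority weights zero out samples whose current residual is large, the outliers contribute little to the output-weight solution in early rounds, which is the intuition behind the robustness claim; the actual SPBLS model and its incremental updates are specified in the article itself.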