Abstract

To simplify the network structure of the broad learning system (BLS), this article proposes a novel simplification method called compact BLS (CBLS). Groups of nodes play an important role in the modeling process of BLS, which implies that correlations may exist between nodes. The proposed CBLS focuses not only on the compactness of the network structure but also on the correlation between nodes. Drawing on the ideas of Fused Lasso and Smooth Lasso, it uses an L1-regularization term to penalize each output weight and a fusion term to penalize the difference between adjacent output weights. The L1-regularization term determines the correlation between the nodes and the outputs, whereas the fusion term captures the correlation between nodes. By optimizing the output weights iteratively, the simplification process accounts for both the correlation between the nodes and the outputs and the correlation between nodes simultaneously. As a result, the network structure is simplified more reasonably without reducing prediction accuracy, and a sparse and smooth output-weight solution is obtained that reflects the group-learning characteristic of BLS. Furthermore, based on the fusion terms used in Fused Lasso and Smooth Lasso, two different simplification strategies are developed and compared. Multiple experiments on public datasets demonstrate the feasibility and effectiveness of the proposed methods.
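The abstract does not reproduce the optimization details, but a minimal sketch of the Smooth-Lasso-style variant it describes might look like the following: a least-squares fit of the node outputs to the targets, an L1 penalty on each output weight, and a fusion penalty on differences between adjacent output weights, minimized iteratively. The function names, penalty values, and the proximal-gradient (ISTA) solver used here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def soft_threshold(x, tau):
    """Element-wise soft-thresholding, the proximal operator of the L1 term."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def smooth_cbls_weights(A, Y, lam1=1e-3, lam2=1e-3, n_iter=500):
    """Sketch: iteratively estimate sparse, smooth BLS output weights W.

    Assumed objective (Smooth-Lasso-style fusion term):
        ||Y - A @ W||_F^2 + lam1 * ||W||_1 + lam2 * ||D @ W||_F^2
    where A stacks the outputs of the feature and enhancement nodes and
    D is the first-difference matrix coupling adjacent nodes' weights.
    """
    n_nodes = A.shape[1]
    D = np.diff(np.eye(n_nodes), axis=0)      # (n_nodes - 1, n_nodes) difference matrix
    DtD = D.T @ D
    # Step size from a Lipschitz bound on the smooth part of the objective.
    L = 2.0 * (np.linalg.norm(A.T @ A, 2) + lam2 * np.linalg.norm(DtD, 2))
    step = 1.0 / L
    W = np.zeros((n_nodes, Y.shape[1]))
    for _ in range(n_iter):
        grad = 2.0 * (A.T @ (A @ W - Y) + lam2 * DtD @ W)   # gradient of the smooth part
        W = soft_threshold(W - step * grad, step * lam1)     # prox step for the L1 term
    return W

# Toy usage with hypothetical sizes: 200 samples, 30 nodes, 1 output.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 30))
Y = rng.standard_normal((200, 1))
W = smooth_cbls_weights(A, Y, lam1=0.5, lam2=0.5)
print("nonzero output weights:", int(np.count_nonzero(W)))
```

The Fused-Lasso-style strategy mentioned in the abstract would instead penalize the absolute value of adjacent weight differences, which is non-smooth and typically requires a different solver; the sketch above uses the squared-difference (Smooth Lasso) form only because its fusion term stays differentiable.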
