Abstract

This paper presents a smoothing group L1/2 regularized discriminative broad learning system for pattern classification and regression. The core idea is to improve the sparseness of the standard broad learning system and, in turn, its recognition and generalization performance. First, the ε-dragging technique is introduced into the standard broad learning system to relax the regression targets and enlarge the distances between categories. Then, group L1/2 regularization is integrated to optimize the network architecture and achieve sparsity. The objective function of the original group L1/2 regularization is non-convex and non-smooth, which makes theoretical analysis difficult. We therefore propose a simple and effective smoothing technique, the smoothing group L1/2 regularization, which removes this deficiency of the original penalty. As a result, the final weight projection matrix is compact and discriminative. In addition, the alternating direction method of multipliers is adopted to optimize the resulting problem. Simulation results show that the proposed algorithm controls redundancy and improves recognition and generalization performance, confirming the theoretical analysis.
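To make the regularizer concrete, the following minimal Python sketch contrasts the original (non-smooth) group L1/2 penalty with one common smoothing of the group norm near the origin. The function names, the specific surrogate (||W_g||² + ε²)^(1/4), and the parameter `eps` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def group_l12_penalty(W, groups):
    """Original (non-smooth) group L1/2 penalty: sum_g ||W_g||_2^(1/2)."""
    return sum(np.linalg.norm(W[g]) ** 0.5 for g in groups)

def smoothed_group_l12_penalty(W, groups, eps=1e-3):
    """A smoothed surrogate: sum_g (||W_g||_2^2 + eps^2)^(1/4).

    One common way to remove the non-differentiability at ||W_g|| = 0;
    it approaches the original penalty as eps -> 0. (Illustrative choice;
    the paper's exact smoothing may differ.)
    """
    return sum((np.linalg.norm(W[g]) ** 2 + eps ** 2) ** 0.25 for g in groups)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W = rng.normal(size=12)                      # flat weight vector
    groups = [np.arange(0, 4), np.arange(4, 8), np.arange(8, 12)]  # index sets per group
    print("original :", group_l12_penalty(W, groups))
    print("smoothed :", smoothed_group_l12_penalty(W, groups))
```

Because the smoothed surrogate is differentiable everywhere, it is amenable to the kind of gradient-based analysis and ADMM-style splitting described in the abstract, whereas the original penalty is not differentiable at group norms equal to zero.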

