Abstract
Deep convolutional neural networks (CNNs) have successfully addressed numerous challenging real-world problems and are known to be powerful, if complex, classifiers. However, finding a simple CNN architecture that properly estimates an unknown solution remains a challenging problem. With the objective of regularizing networks at the group level rather than at the level of individual weights, we propose the sparse smooth Group L0∘L1/2 regularization method, which is formulated as a composition of L0 and L1/2 regularizers. Both the L0 and L1/2 regularization terms are nonsmooth, however, which complicates training. To overcome this issue, we propose a novel smooth function that approximates the L1/2 regularizer, and we use an existing smooth indicator function to approximate the L0 regularizer. Numerical simulation results on a range of benchmark datasets and various CNN architectures demonstrate the effectiveness of the proposed method. It outperforms well-known group sparse regularization algorithms in several respects, most notably in achieving superior group and weight sparsity.
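The abstract does not give the smoothing functions in closed form, so the following PyTorch sketch is only one plausible instantiation of the idea, not the paper's method: it smooths |x|^(1/2) with (x² + ε²)^(1/4) and the L0 indicator with 1 − exp(−x²/σ²), both standard smoothing choices assumed here for illustration, and it treats each convolutional filter as one group.

```python
# A minimal, hypothetical sketch of a group-level L0∘L1/2-style penalty.
# The smoothing functions and the group definition are assumptions made
# for illustration; the paper's exact formulation may differ.

import torch

def smooth_l12(x, eps=1e-3):
    # Smooth surrogate for |x|^(1/2): differentiable at x = 0,
    # approaching |x|^(1/2) as eps -> 0.
    return (x * x + eps * eps) ** 0.25

def smooth_l0(x, sigma=1e-2):
    # Smooth surrogate for the 0/1 indicator that x is nonzero,
    # approaching the true indicator as sigma -> 0.
    return 1.0 - torch.exp(-(x * x) / (sigma * sigma))

def group_l0_l12_penalty(weight, lam=1e-4):
    # Treat each output filter of a conv layer (shape [out, in, kh, kw])
    # as one group: apply the smoothed L0 to each group's smoothed L1/2
    # norm, pushing entire filters toward zero.
    groups = weight.reshape(weight.size(0), -1)       # one row per filter
    group_norms = smooth_l12(groups).sum(dim=1)       # smoothed L1/2 per group
    return lam * smooth_l0(group_norms).sum()         # smoothed L0 over groups
```

In training, such a penalty would simply be added to the task loss, e.g. summing `group_l0_l12_penalty(m.weight)` over all `torch.nn.Conv2d` modules of the model; because both surrogates are smooth, the combined objective remains trainable with standard gradient-based optimizers.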