Abstract

Deep convolutional neural networks (CNNs) have successfully addressed numerous challenging real-world problems and are among the most complex and powerful classifiers available. However, finding a compact CNN architecture that properly estimates an unknown solution remains a challenging problem. With the goal of regularizing networks at the group level rather than at the level of individual weights, we propose the sparse smooth Group L0∘L1/2 regularization method, formulated as a composition of the L0 and L1/2 regularizers. Both regularization terms are nonsmooth, however, which leads to difficulties during training. To overcome this issue, we propose a novel smooth function that approximates the L1/2 regularizer, and we adopt an existing smooth indicator function to approximate the L0 regularizer. Numerical simulation results on a range of benchmark datasets and various CNN architectures demonstrate the effectiveness of the proposed method: it outperforms well-known group sparse regularization algorithms in several respects, most notably in achieving superior group and weight sparsity.
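To make the composite penalty concrete, the sketch below illustrates a group-level L0∘L1/2 penalty built from smooth surrogates. The abstract does not specify the authors' exact smoothing functions, so the surrogates here (a (‖w‖² + ε)^{1/4} smoothing of the group L1/2 term and a 1 − exp(−β t²) smooth indicator for L0) are hypothetical stand-ins chosen only to show the structure: the smooth L0 indicator is applied to the smoothed L1/2 value of each weight group and summed over groups.

```python
import numpy as np

EPS = 1e-6  # L1/2 smoothing parameter (hypothetical choice)
BETA = 5.0  # sharpness of the smooth L0 indicator (hypothetical choice)

def smooth_l_half(group):
    """Smooth surrogate for the L1/2 value ||w||_2^{1/2} of one group.

    ||w||_2^{1/2} is nonsmooth at the origin; (||w||_2^2 + EPS)^{1/4}
    is differentiable everywhere and approaches it as EPS -> 0.
    """
    return (np.sum(group ** 2) + EPS) ** 0.25

def smooth_l0(t):
    """Smooth indicator approximating L0: ~0 when t ~ 0, ~1 otherwise."""
    return 1.0 - np.exp(-BETA * t ** 2)

def group_l0_l_half_penalty(groups):
    """Composite penalty: the smooth L0 indicator applied to the smoothed
    L1/2 value of each group (e.g., one group per convolutional filter),
    summed over all groups."""
    return sum(smooth_l0(smooth_l_half(g)) for g in groups)

# Usage: three 3x3 filters, one of them exactly zero. The zero filter
# contributes ~0 and each active filter ~1, so the penalty is ~2,
# counting (approximately) the number of active groups.
rng = np.random.default_rng(0)
filters = [rng.normal(size=(3, 3)), np.zeros((3, 3)), rng.normal(size=(3, 3))]
print(group_l0_l_half_penalty(filters))
```

Because both surrogates are differentiable, the penalty can be added to the training loss and minimized with ordinary gradient-based optimizers, which is the practical motivation for smoothing the nonsmooth L0 and L1/2 terms in the first place.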
