Abstract
Group Lasso is a mixed l1/l2-regularization method for block-wise sparse models that has attracted considerable interest in statistics, machine learning, and data mining. This paper establishes that original signals can be stably recovered from noisy data using the adaptive group Lasso, given a combination of sufficient block-sparsity and a favorable block structure of the overcomplete dictionary. The corresponding theoretical results on solution uniqueness, support recovery, and the representation error bound are derived from the properties of block-coherence and subcoherence. Compared with the theoretical results for the parametrized quadratic program of conventional sparse representation, our stability results are more general. A comparison with the block-based orthogonal greedy algorithm is also presented. Numerical experiments validate the theoretical derivations and show that, in the noisy setting, the adaptive group Lasso reconstructs better than the quadratic program approach when the observed sparse signals have a natural block structure.
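To make the mixed l1/l2 objective concrete, the following is a minimal numerical sketch of the (non-adaptive) group Lasso solved by proximal gradient descent (ISTA) with block soft-thresholding. All names (`A`, `y`, `lam`, the synthetic block-sparse signal) are illustrative assumptions, not from the paper; the paper's adaptive variant would further reweight the per-block penalties.

```python
import numpy as np

# Sketch of the group Lasso objective
#   min_x 0.5*||y - A x||_2^2 + lam * sum_g ||x_g||_2
# solved by proximal gradient (ISTA). Groups are contiguous blocks here;
# A, y, lam, and the group layout are illustrative, not the paper's setup.

def block_soft_threshold(v, t):
    """Proximal operator of t*||.||_2: shrink the whole block toward zero."""
    norm = np.linalg.norm(v)
    if norm <= t:
        return np.zeros_like(v)   # inactive block is set exactly to zero
    return (1.0 - t / norm) * v

def group_lasso_ista(A, y, groups, lam, n_iter=500):
    """Iterative shrinkage-thresholding for the group Lasso."""
    x = np.zeros(A.shape[1])
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)           # gradient of the smooth data-fit term
        z = x - grad / L                   # gradient step
        for g in groups:                   # block-wise shrinkage step
            x[g] = block_soft_threshold(z[g], lam / L)
    return x

# Synthetic block-sparse example: 4 blocks of size 3, blocks 0 and 2 active.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 12))
x_true = np.zeros(12)
x_true[0:3] = [1.5, -2.0, 1.0]
x_true[6:9] = [0.8, 1.2, -1.5]
y = A @ x_true + 0.01 * rng.standard_normal(40)
groups = [slice(3 * i, 3 * (i + 1)) for i in range(4)]
x_hat = group_lasso_ista(A, y, groups, lam=0.5)
```

In this sketch the inactive blocks are driven exactly to zero by the block soft-thresholding step, illustrating the block-wise support recovery that the paper's stability results address.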