Abstract
Sparse recovery under nonnegativity and sum-to-one constraints is a special form of the linear regression problem in which the solution must simultaneously satisfy sparsity, nonnegativity, and sum-to-one constraints. Existing algorithms for this task mainly use the penalty technique to convert the sparsity constraint into a regularization term, so the sparsity level is controlled indirectly by tuning the associated penalty parameter, which is time-consuming in practice. This paper exploits projected gradient descent to tackle the constrained problem directly, without a penalty parameter or an ℓ0-norm approximation. Imposing the ℓ0-norm constraint with a specific upper bound enables the proposed algorithm to control sparsity explicitly. The developed method, termed modified iterative hard thresholding (MIHT), comprises two iterative steps: gradient descent and a nonconvex projection. For the latter, the constraint set combines the ℓ0-norm, nonnegativity, and sum-to-one constraints. We devise an efficient algorithm for this nonconvex projection and prove that it produces an optimal solution. Furthermore, we establish the convergence of MIHT in terms of both the objective value and the variable sequence. Numerical experiments on financial and hyperspectral data demonstrate that MIHT outperforms state-of-the-art methods in prediction error and recovery accuracy.
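The two iterative steps described above can be sketched in NumPy. This is an illustrative sketch, not the paper's implementation: the function names, the step size 1/‖A‖², and the iteration count are assumptions, and the nonconvex projection is realized here by the well-known greedy strategy of keeping the k largest entries and then projecting them onto the probability simplex via the standard sort-based algorithm; the paper's provably optimal projection routine may differ.

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of v onto the probability simplex
    {x : x >= 0, sum(x) = 1}, using the sort-based algorithm."""
    u = np.sort(v)[::-1]                      # entries in descending order
    css = np.cumsum(u)
    idx = np.arange(1, len(v) + 1)
    rho = np.nonzero(u - (css - 1.0) / idx > 0)[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1)      # shift that enforces sum = 1
    return np.maximum(v - theta, 0.0)

def sparse_simplex_projection(v, k):
    """Greedy projection onto {x : ||x||_0 <= k, x >= 0, 1^T x = 1}:
    keep the k largest entries of v, project them onto the simplex,
    and zero out the rest (an assumed realization of the paper's step)."""
    x = np.zeros_like(v)
    support = np.argsort(v)[-k:]              # indices of the k largest entries
    x[support] = project_simplex(v[support])
    return x

def miht(A, y, k, step=None, n_iter=200):
    """Projected-gradient sketch for min ||Ax - y||^2 subject to the
    k-sparse simplex constraint. Hypothetical parameter names; the step
    size defaults to 1 / ||A||_2^2 (reciprocal squared spectral norm)."""
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = sparse_simplex_projection(np.ones(A.shape[1]), k)  # feasible start
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)              # gradient-descent step
        x = sparse_simplex_projection(x - step * grad, k)  # nonconvex projection
    return x
```

Note that every iterate is feasible by construction: the projection guarantees nonnegativity, unit sum, and at most k nonzeros regardless of the gradient step.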