Abstract
This paper is concerned with the [Formula: see text]-norm ball constrained multi-task learning problem, which has received extensive attention in research areas such as machine learning, cognitive neuroscience, and signal processing. To address the challenges of solving large-scale multi-task Lasso problems, this paper develops an inexact semismooth Newton-based augmented Lagrangian (Ssnal) algorithm. The inner problems of the Ssnal algorithm are solved by a semismooth Newton (Ssn) algorithm, which enjoys superlinear or even quadratic convergence. Theoretically, this paper establishes the global convergence and the asymptotically superlinear local convergence of the Ssnal algorithm under standard conditions. Computationally, we derive an efficient procedure to construct the generalized Jacobian of the projector onto the [Formula: see text]-norm ball, an essential component of the Ssnal algorithm, which keeps the per-iteration cost of the Ssn algorithm very low. Comprehensive numerical experiments on multi-task Lasso problems demonstrate that the Ssnal algorithm is more efficient and robust than several existing state-of-the-art first-order algorithms.
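To make the idea behind the abstract more concrete, the following is a minimal, self-contained Python/NumPy sketch of a semismooth Newton iteration on a toy ball-constrained quadratic program: the natural-residual optimality equation is solved with Newton steps built from a diagonal element of the generalized Jacobian of an l_inf-ball projector. Everything here is an assumption made for illustration only (the function names, the choice of the l_inf ball, the quadratic model, and the absence of an outer augmented Lagrangian loop or globalization); it does not reproduce the paper's Ssnal algorithm or the specific norm ball it treats.

import numpy as np

def proj_linf_ball(z, r):
    # Euclidean projection onto the l_inf-ball of radius r: coordinate-wise clipping.
    return np.clip(z, -r, r)

def jacobian_proj_linf(z, r):
    # One element of the (Clarke) generalized Jacobian of the l_inf-ball projector at z.
    # It is diagonal: 1 where the coordinate lies strictly inside the ball, 0 where it is clipped.
    return (np.abs(z) < r).astype(float)

def ssn_ball_constrained_qp(Q, c, r, x0=None, tol=1e-10, max_iter=50):
    # Semismooth Newton on the natural residual F(x) = x - Proj_B(x - grad f(x))
    # for  min 0.5 x'Qx - c'x  subject to  ||x||_inf <= r,  with Q symmetric positive definite.
    # An element of the generalized Jacobian of F is  I - D(I - Q), with D diagonal as above.
    # Illustrative toy setting only; no line search / globalization is included.
    n = c.size
    x = np.zeros(n) if x0 is None else x0.copy()
    I = np.eye(n)
    for it in range(max_iter):
        grad = Q @ x - c
        z = x - grad
        F = x - proj_linf_ball(z, r)
        if np.linalg.norm(F) <= tol:
            break
        d = jacobian_proj_linf(z, r)           # diagonal of D
        J = I - d[:, None] * (I - Q)           # I - D(I - Q)
        dx = np.linalg.solve(J, -F)
        x = x + dx
    return x, it

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((30, 10))
    Q = A.T @ A + 0.1 * np.eye(10)             # strongly convex quadratic
    c = rng.standard_normal(10)
    x, iters = ssn_ball_constrained_qp(Q, c, 1.0)
    print("iterations:", iters, "max |x_i|:", np.abs(x).max())

The diagonal structure of the projector's generalized Jacobian is what makes each Newton system cheap in this toy example; the paper's contribution is an analogous efficient construction for the generalized Jacobian of the projector onto its [Formula: see text]-norm ball.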