Abstract
Ridge regression (RR) is widely used in machine learning but faces computational challenges in big data applications. To meet these challenges, this article develops a highly parallel new algorithm, an accelerated maximally split alternating direction method of multipliers (A-MS-ADMM), for a class of generalized RR (GRR) that allows different regularization factors for different regression coefficients. Linear convergence of the new algorithm, along with its convergence ratio, is established. Optimal parameters of the algorithm are derived for the GRR with a particular set of regularization factors, and a selection scheme for the algorithm parameters under general regularization factors is also discussed. The new algorithm is then applied to the training of single-layer feedforward neural networks. Experimental results on real-world benchmark datasets for regression and classification, together with comparisons against existing methods, demonstrate the fast convergence, low computational complexity, and high parallelism of the new algorithm.
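For context, the sketch below illustrates the GRR problem the abstract refers to, assuming the standard formulation in which each coefficient carries its own penalty: minimize ||Xw - y||² + Σ_j λ_j w_j². It uses the textbook closed-form solution rather than the paper's A-MS-ADMM iterations, which are not reproduced here; the function name and example data are hypothetical.

```python
# Minimal sketch of generalized ridge regression (GRR) with a distinct
# regularization factor lambda_j per coefficient, via the closed-form
# normal equations. This is NOT the paper's A-MS-ADMM algorithm.
import numpy as np

def grr_closed_form(X, y, lambdas):
    """Solve (X^T X + diag(lambdas)) w = X^T y for the GRR coefficients."""
    XtX = X.T @ X
    return np.linalg.solve(XtX + np.diag(lambdas), X.T @ y)

# Hypothetical example: 100 samples, 5 features, per-coefficient penalties.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.1 * rng.standard_normal(100)
w = grr_closed_form(X, y, lambdas=np.array([0.1, 0.5, 1.0, 2.0, 0.1]))
print(w)
```

Setting all λ_j equal recovers ordinary ridge regression; the direct solve scales poorly with the number of features, which motivates splitting methods such as ADMM for big data settings.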