Abstract

In recent years, classification with big data sets has become one of the latest research topics in machine learning, and distributed classification has received much attention from both industry and academia. Recently, the Alternating Direction Method of Multipliers (ADMM) has become a widely used method for solving learning problems in a distributed manner due to its simplicity and scalability. However, distributed ADMM usually converges slowly and thus suffers from high time cost in practice. To overcome this limitation, we propose a novel distributed stochastic ADMM (DS-ADMM) algorithm for big data classification based on the MPI framework. The original problem is formulated as a series of sub-problems that are solved across a cluster of multiple computers (nodes). In particular, we exploit a stochastic method to optimize the sub-problems in parallel, further improving time efficiency. The experimental results show that our proposed distributed algorithm effectively enhances the performance of ADMM and can be applied to big data classification.

Keywords: Big data, ADMM, Stochastic ADMM, Distributed classification

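The full DS-ADMM implementation is not given on this page, but the pattern the abstract describes, partitioning the training data across MPI nodes, solving each node's ADMM sub-problem approximately with stochastic gradient steps, and forming the consensus variable by averaging over nodes with an all-reduce, can be sketched as below. This is a minimal illustrative sketch assuming a logistic-regression classifier and scaled-dual consensus ADMM; the function and parameter names (run_ds_admm, local_sgd_step, rho, n_inner, etc.) are assumptions, not the authors' actual code.

```python
# Hypothetical sketch: consensus ADMM with stochastic local updates over MPI.
# Run with e.g.: mpiexec -n 4 python ds_admm_sketch.py
import numpy as np
from mpi4py import MPI


def local_sgd_step(w, z, u, X, y, rho, lr, batch_size, rng):
    """One mini-batch gradient step on the local augmented-Lagrangian term
    f_i(w) + (rho/2) * ||w - z + u||^2, where f_i is a logistic loss on
    this node's data and labels y are in {-1, +1}."""
    idx = rng.choice(X.shape[0], size=min(batch_size, X.shape[0]), replace=False)
    Xb, yb = X[idx], y[idx]
    margin = yb * (Xb @ w)
    # Gradient of the averaged logistic loss on the mini-batch.
    grad_f = -(Xb * (yb / (1.0 + np.exp(margin)))[:, None]).mean(axis=0)
    grad = grad_f + rho * (w - z + u)            # gradient of the local sub-problem
    return w - lr * grad


def run_ds_admm(X, y, rho=1.0, lr=0.1, n_outer=50, n_inner=20, batch_size=32, seed=0):
    comm = MPI.COMM_WORLD
    n_nodes = comm.Get_size()
    rng = np.random.default_rng(seed + comm.Get_rank())

    d = X.shape[1]
    w = np.zeros(d)          # local primal variable on this node
    z = np.zeros(d)          # global consensus variable (same on all nodes)
    u = np.zeros(d)          # scaled dual variable on this node

    for _ in range(n_outer):
        # (1) w-update: solve the local sub-problem approximately with a few
        #     stochastic steps instead of an exact minimization.
        for _ in range(n_inner):
            w = local_sgd_step(w, z, u, X, y, rho, lr, batch_size, rng)

        # (2) z-update: average (w_i + u_i) over all nodes via an all-reduce.
        local = w + u
        total = np.empty_like(local)
        comm.Allreduce(local, total, op=MPI.SUM)
        z = total / n_nodes

        # (3) Dual update, performed independently on each node.
        u = u + w - z

    return z
```

In this sketch, the only communication per outer iteration is the single all-reduce in the z-update, which is what makes the consensus formulation attractive for clusters; the inexact, stochastic w-update is the step that the abstract credits with the improved time efficiency.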