Abstract

Efficient communication is crucial to Distributed Machine Learning (DML). In this work, we propose an approach that jointly optimizes Data Formatting and Sparsification (DFS) to reduce communication in DML systems built on the parameter server framework. By doing so, we reduce the time to transmit (aggregated) gradients between the parameter server and workers, and consequently the time to complete training jobs. More specifically, in DFS, each worker first derives as many all-zero gradient blocks as possible via sparsification and then transmits the gradients block by block in a streaming fashion. By skipping all-zero blocks, we reduce the communication cost of gradient transmission. Unlike previous work on optimizing communication in DML systems, DFS has three distinct features: (i) it dynamically determines the gradient block size; (ii) it accounts for both the data transfer from workers to the parameter server and that from the parameter server to workers; and (iii) it jointly optimizes data formatting and sparsification, i.e., it performs sparsification in a way that forms more (or larger) all-zero blocks and thus saves more communication cost. By implementing DFS on a real testbed, we find that it reduces the time to train a ResNet-18 model by 74.12%. Through extensive simulations, we demonstrate that DFS outperforms the state-of-the-art technique, OmniReduce (Fei et al., 2021), by up to 87.17% in terms of reducing communication cost in DML systems.
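To make the block-skipping idea concrete, here is a minimal, hypothetical sketch of the transmission step the abstract describes: top-k magnitude sparsification followed by block-wise packing that omits all-zero blocks. The function name `sparsify_and_pack`, the `(offset, payload)` packet format, and the fixed `block_size` are illustrative assumptions only; in particular, DFS itself determines the block size dynamically, which this simplified sketch does not attempt.

```python
import numpy as np

def sparsify_and_pack(grad, block_size, keep_ratio=0.01):
    """Hypothetical sketch of DFS-style gradient transmission:
    sparsify the gradient (magnitude top-k), split it into
    fixed-size blocks, and emit only the non-all-zero blocks.
    Note: real DFS chooses the block size dynamically."""
    flat = grad.ravel().copy()
    # Top-k sparsification: zero everything below the k-th
    # largest magnitude, so most entries become exactly zero.
    k = max(1, int(keep_ratio * flat.size))
    threshold = np.partition(np.abs(flat), -k)[-k]
    flat[np.abs(flat) < threshold] = 0.0

    # Stream the gradient block by block, skipping all-zero blocks;
    # each packet carries its offset so the receiver can reassemble.
    packets = []
    for start in range(0, flat.size, block_size):
        block = flat[start:start + block_size]
        if np.any(block):                   # skip all-zero blocks
            packets.append((start, block))  # (offset, payload)
    return packets

# Example: after sparsification, a 1M-entry gradient collapses
# to a small fraction of its blocks actually being transmitted.
rng = np.random.default_rng(0)
grads = rng.standard_normal(1_000_000)
pkts = sparsify_and_pack(grads, block_size=256)
print(f"blocks sent: {len(pkts)} of {1_000_000 // 256}")
```

In this sketch, the communication saving comes entirely from how many blocks end up all-zero, which is why a sparsifier that clusters the surviving gradients into few blocks (as DFS's joint optimization aims to do) saves more than one that scatters them uniformly.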
