
Asynchronous Decentralized SGD with Quantized and Local Updates

Nadiradze, Giorgi; Sabour, Amirmojtaba; Davies, Peter; Li, Shigang; Alistarh, Dan

Authors

Giorgi Nadiradze

Amirmojtaba Sabour

Peter Davies

Shigang Li

Dan Alistarh



Abstract

Decentralized optimization is emerging as a viable alternative for scalable distributed machine learning, but also introduces new challenges in terms of synchronization costs. To this end, several communication-reduction techniques, such as non-blocking communication, quantization, and local steps, have been explored in the decentralized setting. Due to the complexity of analyzing optimization in such a relaxed setting, this line of work often assumes global communication rounds, which require additional synchronization. In this paper, we consider decentralized optimization in the simpler, but harder to analyze, asynchronous gossip model, in which communication occurs in discrete, randomly chosen pairings among nodes. Perhaps surprisingly, we show that a variant of SGD called SwarmSGD still converges in this setting, even if non-blocking communication, quantization, and local steps are all applied in conjunction, and even if the node data distributions and underlying graph topology are both heterogeneous. Our analysis is based on a new connection with multi-dimensional load-balancing processes. We implement this algorithm and deploy it in a supercomputing environment, showing that it can outperform previous decentralized methods in terms of end-to-end training time, and that it can even rival carefully tuned large-batch SGD for certain tasks.
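As a rough illustration of the algorithmic pattern the abstract describes (local SGD steps combined with quantized, pairwise gossip averaging), the following is a minimal Python sketch. It simulates the random pairwise interactions sequentially on toy quadratic objectives; the quantizer, step counts, learning rate, and helper names such as `gossip_round` are illustrative assumptions, not the paper's SwarmSGD implementation.

```python
import numpy as np

def quantize(x, levels=256):
    """Simple stochastic uniform quantizer (illustrative; not the paper's scheme)."""
    scale = np.max(np.abs(x)) + 1e-12
    y = x / scale * (levels // 2)
    low = np.floor(y)
    q = low + (np.random.rand(*x.shape) < (y - low))   # stochastic rounding
    return q * scale / (levels // 2)

def local_sgd_steps(w, grad_fn, steps=4, lr=0.05):
    """A few local SGD steps on a node's own (possibly heterogeneous) data."""
    for _ in range(steps):
        w = w - lr * grad_fn(w)
    return w

def gossip_round(models, grad_fns, local_steps=4, lr=0.05):
    """One gossip interaction: a uniformly random pair of nodes performs local
    steps, exchanges quantized models, and averages them."""
    i, j = np.random.choice(len(models), size=2, replace=False)
    for k in (i, j):
        models[k] = local_sgd_steps(models[k], grad_fns[k], local_steps, lr)
    avg = 0.5 * (quantize(models[i]) + quantize(models[j]))
    models[i], models[j] = avg.copy(), avg.copy()

# Toy usage: 8 nodes, each with its own quadratic objective (heterogeneous data).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    targets = rng.normal(size=(8, 10))                   # per-node optima
    grad_fns = [lambda w, t=t: w - t for t in targets]   # grad of 0.5*||w - t||^2
    models = [np.zeros(10) for _ in range(8)]
    for _ in range(2000):
        gossip_round(models, grad_fns)
    print("mean distance to average optimum:",
          np.mean([np.linalg.norm(w - targets.mean(0)) for w in models]))
```

In this sketch each interaction is executed sequentially for simplicity; in the paper's asynchronous, non-blocking setting the paired nodes would proceed without waiting on a global round.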

Citation

Nadiradze, G., Sabour, A., Davies, P., Li, S., & Alistarh, D. (2021, December). Asynchronous Decentralized SGD with Quantized and Local Updates. Presented at Thirty-Fifth Annual Conference on Neural Information Processing Systems (NeurIPS 2021), Online

Presentation Conference Type: Conference Paper (published)
Conference Name: Thirty-Fifth Annual Conference on Neural Information Processing Systems (NeurIPS 2021)
Start Date: Dec 7, 2021
Acceptance Date: Sep 27, 2021
Online Publication Date: Nov 9, 2021
Publication Date: Nov 9, 2021
Deposit Date: Jan 10, 2025
Peer Reviewed: Yes
Book Title: Advances in Neural Information Processing Systems 34 (NeurIPS 2021)
ISBN: 9781713845393
Public URL: https://durham-repository.worktribe.com/output/3329579
Publisher URL: https://papers.nips.cc/paper/2021/hash/362c99307cdc3f2d8b410652386a9dd1-Abstract.html
Other Repo URL: https://research-explorer.ista.ac.at/record/10435

