Asynchronous Decentralized SGD with Quantized and Local Updates
Authors
Giorgi Nadiradze
Amirmojtaba Sabour
Dr Peter Davies-Peck (peter.w.davies@durham.ac.uk), Assistant Professor
Shigang Li
Dan Alistarh
Abstract
Decentralized optimization is emerging as a viable alternative for scalable distributed machine learning, but also introduces new challenges in terms of synchronization costs. To this end, several communication-reduction techniques, such as non-blocking communication, quantization, and local steps, have been explored in the decentralized setting. Due to the complexity of analyzing optimization in such a relaxed setting, this line of work often assumes global communication rounds, which require additional synchronization. In this paper, we consider decentralized optimization in the simpler, but harder to analyze, asynchronous gossip model, in which communication occurs in discrete, randomly chosen pairings among nodes. Perhaps surprisingly, we show that a variant of SGD called SwarmSGD still converges in this setting, even if non-blocking communication, quantization, and local steps are all applied in conjunction, and even if the node data distributions and underlying graph topology are both heterogeneous. Our analysis is based on a new connection with multi-dimensional load-balancing processes. We implement this algorithm and deploy it in a supercomputing environment, showing that it can outperform previous decentralized methods in terms of end-to-end training time, and that it can even rival carefully tuned large-batch SGD for certain tasks.
Citation
Nadiradze, G., Sabour, A., Davies, P., Li, S., & Alistarh, D. (2021, December). Asynchronous Decentralized SGD with Quantized and Local Updates. Presented at Thirty-Fifth Annual Conference on Neural Information Processing Systems (NeurIPS 2021), Online
| Presentation Conference Type | Conference Paper (published) |
|---|---|
| Conference Name | Thirty-Fifth Annual Conference on Neural Information Processing Systems (NeurIPS 2021) |
| Start Date | Dec 7, 2021 |
| Acceptance Date | Sep 27, 2021 |
| Online Publication Date | Nov 9, 2021 |
| Publication Date | Nov 9, 2021 |
| Deposit Date | Jan 10, 2025 |
| Peer Reviewed | Peer Reviewed |
| Book Title | Advances in Neural Information Processing Systems 34 (NeurIPS 2021) |
| ISBN | 9781713845393 |
| Public URL | https://durham-repository.worktribe.com/output/3329579 |
| Publisher URL | https://papers.nips.cc/paper/2021/hash/362c99307cdc3f2d8b410652386a9dd1-Abstract.html |
| Other Repo URL | https://research-explorer.ista.ac.at/record/10435 |
You might also like
Component stability in low-space massively parallel computation (2024), Journal Article
Optimal (degree+1)-Coloring in Congested Clique (2023), Presentation / Conference Contribution
Uniting General-Graph and Geometric-Based Radio Networks via Independence Number Parametrization (2023), Presentation / Conference Contribution
Improved Distributed Algorithms for the Lovász Local Lemma and Edge Coloring (2023), Presentation / Conference Contribution
Optimal Message-Passing with Noisy Beeps (2023), Presentation / Conference Contribution