arXiv:2306.08289
$\textbf{A}^2\textbf{CiD}^2$: Accelerating Asynchronous Communication in Decentralized Deep Learning
14 June 2023
Adel Nabli, Eugene Belilovsky, Edouard Oyallon
Papers citing "$\textbf{A}^2\textbf{CiD}^2$: Accelerating Asynchronous Communication in Decentralized Deep Learning" (5 papers)
Accelerating AllReduce with a Persistent Straggler
Arjun Devraj, Eric Ding, Abhishek Vijaya Kumar, Robert Kleinberg, Rachee Singh
29 May 2025

Accelerating MoE Model Inference with Expert Sharding
Oana Balmau, Anne-Marie Kermarrec, Rafael Pires, André Loureiro Espírito Santo, M. Vos, Milos Vujasinovic
MoE
11 Mar 2025

DRACO: Decentralized Asynchronous Federated Learning over Row-Stochastic Wireless Networks
Eunjeong Jeong, Marios Kountouris
19 Jun 2024

WASH: Train your Ensemble with Communication-Efficient Weight Shuffling, then Average
Louis Fournier, Adel Nabli, Masih Aminbeidokhti, M. Pedersoli, Eugene Belilovsky, Edouard Oyallon
MoMe, FedML
27 May 2024

Meta-learning Optimizers for Communication-Efficient Learning
Charles-Étienne Joseph, Benjamin Thérien, A. Moudgil, Boris Knyazev, Eugene Belilovsky
02 Dec 2023