EmbRace: Accelerating Sparse Communication for Distributed Training of NLP Neural Networks
Shengwei Li, Zhiquan Lai, Dongsheng Li, Yiming Zhang, Xiangyu Ye, Yabo Duan
arXiv:2110.09132 · 18 October 2021
Papers citing "EmbRace: Accelerating Sparse Communication for Distributed Training of NLP Neural Networks" (3 of 3 papers shown)

Automated Tensor Model Parallelism with Overlapped Communication for Efficient Foundation Model Training
Shengwei Li, Zhiquan Lai, Yanqi Hao, Weijie Liu, Ke-shi Ge, Xiaoge Deng, Dongsheng Li, KaiCheng Lu
25 May 2023

Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation
Yonghui Wu, M. Schuster, Zhehuai Chen, Quoc V. Le, Mohammad Norouzi, ..., Alex Rudnick, Oriol Vinyals, G. Corrado, Macduff Hughes, J. Dean
26 Sep 2016

Distributed Training of Deep Neural Networks: Theoretical and Practical Limits of Parallel Scalability
J. Keuper, Franz-Josef Pfreundt
22 Sep 2016