arXiv:1705.09786
AMPNet: Asynchronous Model-Parallel Training for Dynamic Neural Networks
27 May 2017
Alexander L. Gaunt, Matthew W. Johnson, M. Riechert, Daniel Tarlow, Ryota Tomioka, Dimitrios Vytiniotis, Sam Webster
BDL
Papers citing "AMPNet: Asynchronous Model-Parallel Training for Dynamic Neural Networks" (10 papers)
PipeOptim: Ensuring Effective 1F1B Schedule with Optimizer-Dependent Weight Prediction
Lei Guan, Dongsheng Li, Jiye Liang, Wenjian Wang, Xicheng Lu
01 Dec 2023
Assessing Hidden Risks of LLMs: An Empirical Study on Robustness, Consistency, and Credibility
Wen-song Ye, Mingfeng Ou, Tianyi Li, Yipeng Chen, Xuetao Ma, ..., Sai Wu, Jie Fu, Gang Chen, Haobo Wang, J. Zhao
15 May 2023
DISCO: Distributed Inference with Sparse Communications
Minghai Qin, Chaowen Sun, Jaco A. Hofmann, D. Vučinić
FedML
22 Feb 2023
DistrEdge: Speeding up Convolutional Neural Network Inference on Distributed Edge Devices
Xueyu Hou, Yongjie Guan, Tao Han, Ning Zhang
03 Feb 2022
Hydra: A System for Large Multi-Model Deep Learning
Kabir Nagrecha, Arun Kumar
MoE, AI4CE
16 Oct 2021
Chimera: Efficiently Training Large-Scale Neural Networks with Bidirectional Pipelines
Shigang Li, Torsten Hoefler
GNN, AI4CE, LRM
14 Jul 2021
Pipelined Backpropagation at Scale: Training Large Models without Batches
Atli Kosson, Vitaliy Chiley, Abhinav Venigalla, Joel Hestness, Urs Koster
25 Mar 2020
Distributed Deep Learning for Precipitation Nowcasting
S. Samsi, Christopher J. Mattioli, Mark S. Veillette
28 Aug 2019
Declarative Recursive Computation on an RDBMS, or, Why You Should Use a Database For Distributed Machine Learning
Dimitrije Jankov, Shangyu Luo, Binhang Yuan, Zhuhua Cai, Jia Zou, C. Jermaine, Zekai J. Gao
25 Apr 2019
Demystifying Parallel and Distributed Deep Learning: An In-Depth Concurrency Analysis
Tal Ben-Nun, Torsten Hoefler
GNN
26 Feb 2018