Parallelizing Training of Deep Generative Models on Massive Scientific Datasets
arXiv:1910.02270 · 5 October 2019
S. A. Jacobs, B. Van Essen, D. Hysom, Jae-Seung Yeom, Tim Moon, Rushil Anirudh, Jayaraman J. Thiagarajan, Shusen Liu, P. Bremer, J. Gaffney, Tom Benson, Peter B. Robinson, L. Peterson, B. Spears
Tags: BDL, AI4CE
Papers citing "Parallelizing Training of Deep Generative Models on Massive Scientific Datasets" (4 of 4 papers shown)
Formal Definitions and Performance Comparison of Consistency Models for Parallel File Systems
Chen Wang, Kathryn M. Mohror, Marc Snir
21 Feb 2024
SOLAR: A Highly Optimized Data Loading Framework for Distributed Training of CNN-based Scientific Surrogates
Baixi Sun, Xiaodong Yu, Chengming Zhang, Jiannan Tian, Sian Jin, K. Iskra, Tao Zhou, Tekin Bicer, Pete Beckman, Dingwen Tao
01 Nov 2022
Clairvoyant Prefetching for Distributed Machine Learning I/O
Nikoli Dryden, Roman Böhringer, Tal Ben-Nun, Torsten Hoefler
21 Jan 2021
The Case for Strong Scaling in Deep Learning: Training Large 3D CNNs with Hybrid Parallelism
Yosuke Oyama, N. Maruyama, Nikoli Dryden, Erin McCarthy, P. Harrington, J. Balewski, Satoshi Matsuoka, Peter Nugent, B. Van Essen
Tags: 3DV, AI4CE
25 Jul 2020