Parallelizing Training of Deep Generative Models on Massive Scientific Datasets

5 October 2019 · arXiv: 1910.02270
S. A. Jacobs, B. Van Essen, D. Hysom, Jae-Seung Yeom, Tim Moon, Rushil Anirudh, Jayaraman J. Thiagarajan, Shusen Liu, P. Bremer, J. Gaffney, Tom Benson, Peter B. Robinson, L. Peterson, B. Spears
Communities: BDL, AI4CE
Links: arXiv (abs) · PDF · HTML

Papers citing "Parallelizing Training of Deep Generative Models on Massive Scientific Datasets"

4 / 4 papers shown

Formal Definitions and Performance Comparison of Consistency Models for Parallel File Systems
Chen Wang, Kathryn M. Mohror, Marc Snir
21 Feb 2024

SOLAR: A Highly Optimized Data Loading Framework for Distributed Training of CNN-based Scientific Surrogates
Baixi Sun, Xiaodong Yu, Chengming Zhang, Jiannan Tian, Sian Jin, K. Iskra, Tao Zhou, Tekin Bicer, Pete Beckman, Dingwen Tao
01 Nov 2022

Clairvoyant Prefetching for Distributed Machine Learning I/O
Nikoli Dryden, Roman Böhringer, Tal Ben-Nun, Torsten Hoefler
21 Jan 2021

The Case for Strong Scaling in Deep Learning: Training Large 3D CNNs with Hybrid Parallelism
Yosuke Oyama, N. Maruyama, Nikoli Dryden, Erin McCarthy, P. Harrington, J. Balewski, Satoshi Matsuoka, Peter Nugent, B. Van Essen
Communities: 3DV, AI4CE
25 Jul 2020