MiCS: Near-linear Scaling for Training Gigantic Model on Public Cloud
30 April 2022
Zhen Zhang
Shuai Zheng
Yida Wang
Justin Chiu
George Karypis
Trishul Chilimbi
Mu Li
Xin Jin
arXiv:2205.00119

Papers citing "MiCS: Near-linear Scaling for Training Gigantic Model on Public Cloud"

3 papers shown
Scaling Large Language Model Training on Frontier with Low-Bandwidth Partitioning
Lang Xu
Quentin G. Anthony
Jacob Hatef
Hari Subramoni
Dhabaleswar K. Panda
08 Jan 2025
Cyclic Data Parallelism for Efficient Parallelism of Deep Neural Networks
Louis Fournier
Edouard Oyallon
13 Mar 2024
Slapo: A Schedule Language for Progressive Optimization of Large Deep Learning Model Training
Hongzheng Chen
Cody Hao Yu
Shuai Zheng
Zhen Zhang
Zhiru Zhang
Yida Wang
16 Feb 2023