gZCCL: Compression-Accelerated Collective Communication Framework for GPU Clusters

9 August 2023
Jiajun Huang, Sheng Di, Xiaodong Yu, Yujia Zhai, Jinyang Liu, Yafan Huang, Kenneth Raffenetti, Hui Zhou, Kai Zhao, Xiaoyi Lu, Zizhong Chen, Franck Cappello, Yanfei Guo, Rajeev Thakur

Papers citing "gZCCL: Compression-Accelerated Collective Communication Framework for GPU Clusters"

4 / 4 papers shown
SDP4Bit: Toward 4-bit Communication Quantization in Sharded Data Parallelism for LLM Training
Jinda Jia, Cong Xie, Hanlin Lu, Daoce Wang, Hao Feng, ..., Baixi Sun, Yanghua Peng, Zhi-Li Zhang, Xin Liu, Dingwen Tao
20 Oct 2024
HoSZp: An Efficient Homomorphic Error-bounded Lossy Compressor for Scientific Data
Tripti Agarwal, Sheng Di, Jiajun Huang, Yafan Huang, Ganesh Gopalakrishnan, Robert Underwood, Kai Zhao, Xin Liang, Guanpeng Li, Franck Cappello
21 Aug 2024
A Survey on Error-Bounded Lossy Compression for Scientific Datasets
Sheng Di, Jinyang Liu, Kai Zhao, Xin Liang, Robert Underwood, ..., Jon C. Calhoun, Guanpeng Li, Kazutomo Yoshii, Khalid Ayed Alharthi, Franck Cappello
03 Apr 2024
An Efficient Statistical-based Gradient Compression Technique for Distributed Training Systems
A. Abdelmoniem, Ahmed Elzanaty, Mohamed-Slim Alouini, Marco Canini
26 Jan 2021