Adaptive Message Quantization and Parallelization for Distributed Full-graph GNN Training

2 June 2023
Borui Wan, Juntao Zhao, Chuan Wu
Topics: GNN

Papers citing "Adaptive Message Quantization and Parallelization for Distributed Full-graph GNN Training"

Distributed Graph Neural Network Training: A Survey
Yingxia Shao, Hongzheng Li, Xizhi Gu, Hongbo Yin, Yawen Li, Xupeng Miao, Wentao Zhang, Bin Cui, Lei Chen
Topics: GNN, AI4CE
01 Nov 2022
Accelerating Training and Inference of Graph Neural Networks with Fast Sampling and Pipelining
Tim Kaler, Nickolas Stathas, Anne Ouyang, A. Iliopoulos, Tao B. Schardl, C. E. Leiserson, Jie Chen
Topics: GNN
16 Oct 2021
Deep Graph Library: A Graph-Centric, Highly-Performant Package for Graph Neural Networks
Minjie Wang, Da Zheng, Zihao Ye, Quan Gan, Mufei Li, ..., Jun Zhao, Haotong Zhang, Alex Smola, Jinyang Li, Zheng-Wei Zhang
Topics: AI4CE, GNN
03 Sep 2019