ResearchTrend.AI

CATGNN: Cost-Efficient and Scalable Distributed Training for Graph Neural Networks (arXiv:2404.02300)

2 April 2024
Xin Huang
Weipeng Zhuo
Minh Phu Vuong
Shiju Li
Jongryool Kim
Bradley Rees
Chul-Ho Lee
    GNN

Papers citing "CATGNN: Cost-Efficient and Scalable Distributed Training for Graph Neural Networks"

2 / 2 papers shown
Accelerating Training and Inference of Graph Neural Networks with Fast Sampling and Pipelining
Tim Kaler, Nickolas Stathas, Anne Ouyang, A. Iliopoulos, Tao B. Schardl, C. E. Leiserson, Jie Chen
GNN
16 Oct 2021
Distributed Training of Deep Neural Networks: Theoretical and Practical Limits of Parallel Scalability
J. Keuper, Franz-Josef Pfreundt
GNN
22 Sep 2016