Exploiting Multiple Levels of Parallelism in Sparse Matrix-Matrix Multiplication

3 October 2015
A. Azad
Grey Ballard
A. Buluç
J. Demmel
L. Grigori
O. Schwartz
Sivan Toledo
Samuel Williams

Papers citing "Exploiting Multiple Levels of Parallelism in Sparse Matrix-Matrix Multiplication"

3 papers shown:
Sparsity-Aware Communication for Distributed Graph Neural Network Training
Ujjaini Mukhopadhyay, Alok Tripathy, Oguz Selvitopi, Katherine Yelick, A. Buluç
07 Apr 2025
A Framework for General Sparse Matrix-Matrix Multiplication on GPUs and Heterogeneous Processors
Weifeng Liu, B. Vinter
20 Apr 2015
Parallel Sparse Matrix-Matrix Multiplication and Indexing: Implementation and Experiments
A. Buluç, J. Gilbert
16 Sep 2011