AH-UGC: Adaptive and Heterogeneous-Universal Graph Coarsening

18 May 2025
Mohit Kataria
Shreyash Bhilwade
Sandeep Kumar
Jayadeva
Abstract

Graph Coarsening (GC) is a prominent graph reduction technique that compresses large graphs to enable efficient learning and inference. However, existing GC methods generate only one coarsened graph per run and must recompute from scratch for each new coarsening ratio, resulting in unnecessary overhead. Moreover, most prior approaches are tailored to homogeneous graphs and fail to accommodate the semantic constraints of heterogeneous graphs, which comprise multiple node and edge types. To overcome these limitations, we introduce a novel framework that combines Locality Sensitive Hashing (LSH) with Consistent Hashing to enable adaptive graph coarsening. Leveraging hashing techniques, our method is inherently fast and scalable. For heterogeneous graphs, we propose a type-isolated coarsening strategy that ensures semantic consistency by restricting merges to nodes of the same type. Our approach is the first unified framework to support both adaptive and heterogeneous coarsening. Extensive evaluations on 23 real-world datasets, including homophilic, heterophilic, homogeneous, and heterogeneous graphs, demonstrate that our method achieves superior scalability while preserving the structural and semantic integrity of the original graph.
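To make the LSH-plus-consistent-hashing idea concrete, here is a minimal sketch in Python with NumPy. It is not the authors' code: the names (lsh_positions, coarsen) and the specific choices (SimHash-style sign projections, an MD5 hash onto a unit ring, equal-arc supernode assignment) are illustrative assumptions. Node features are hashed once to fixed ring positions; any coarsening ratio then just re-buckets those positions, which mirrors the "adaptive" property the abstract describes.

import hashlib
import numpy as np

def lsh_positions(X, n_planes=16, seed=0):
    # Sign-based random projections (SimHash-style LSH): similar feature
    # vectors tend to produce similar bit signatures.
    rng = np.random.default_rng(seed)
    planes = rng.standard_normal((X.shape[1], n_planes))
    bits = (X @ planes >= 0).astype(np.uint8)            # (n, n_planes)
    # Hash each signature to a fixed position on the unit ring [0, 1).
    pos = np.empty(X.shape[0])
    for i, row in enumerate(bits):
        digest = hashlib.md5(row.tobytes()).digest()
        pos[i] = int.from_bytes(digest[:8], "big") / 2**64
    return pos

def coarsen(pos, ratio):
    # Partition the ring into k equal arcs (k supernodes); every node in
    # an arc merges into that arc's supernode. Positions are computed
    # once, so a new ratio only re-buckets them instead of recomputing
    # the coarsening from scratch.
    k = max(1, int(np.ceil(ratio * len(pos))))
    return np.minimum((pos * k).astype(int), k - 1)      # supernode id per node

X = np.random.default_rng(1).standard_normal((1000, 32)) # toy node features
pos = lsh_positions(X)                                   # hashed once
for r in (0.5, 0.3, 0.1):                                # several ratios, one hash pass
    print(r, len(np.unique(coarsen(pos, r))), "supernodes")

Because identical signatures land at identical ring positions, near-duplicate nodes collapse into the same supernode at every ratio, which keeps the coarsenings at different ratios mutually consistent.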

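The type-isolated strategy can be sketched the same way, continuing the example above (reusing pos and numpy; again a hypothetical illustration, not the paper's implementation): run the ring partition separately per node type and keep the supernode id ranges disjoint, so a merge can never cross a type boundary.

def coarsen_heterogeneous(pos, node_types, ratio):
    # Bucket each node type independently; `offset` keeps the supernode
    # ids of different types disjoint, so no supernode ever mixes types.
    labels = np.empty(len(pos), dtype=int)
    offset = 0
    for t in np.unique(node_types):
        mask = node_types == t
        k = max(1, int(np.ceil(ratio * mask.sum())))
        labels[mask] = np.minimum((pos[mask] * k).astype(int), k - 1) + offset
        offset += k
    return labels

node_types = np.array(["author"] * 400 + ["paper"] * 600)  # toy type labels
print(len(np.unique(coarsen_heterogeneous(pos, node_types, 0.2))), "typed supernodes")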
View on arXiv
@article{kataria2025_2505.15842,
  title={ AH-UGC: Adaptive and Heterogeneous-Universal Graph Coarsening },
  author={ Mohit Kataria and Shreyash Bhilwade and Sandeep Kumar and Jayadeva },
  journal={arXiv preprint arXiv:2505.15842},
  year={ 2025 }
}