Mitigating Over-Squashing in Graph Neural Networks by Spectrum-Preserving Sparsification

19 June 2025
Langzhang Liang
Fanchen Bu
Zixing Song
Zenglin Xu
Shirui Pan
Kijung Shin
Main: 9 pages · 6 figures · Bibliography: 2 pages · 10 tables · Appendix: 9 pages
Abstract

The message-passing paradigm of Graph Neural Networks often struggles to exchange information across distant nodes, typically due to structural bottlenecks in certain graph regions, a limitation known as over-squashing. To reduce such bottlenecks, graph rewiring, which modifies graph topology, has been widely used. However, existing graph rewiring techniques often overlook the need to preserve critical properties of the original graph, e.g., its spectral properties. Moreover, many approaches rely on increasing the edge count to improve connectivity, which introduces significant computational overhead and exacerbates the risk of over-smoothing. In this paper, we propose a novel graph rewiring method that leverages spectrum-preserving graph sparsification to mitigate over-squashing. Our method generates graphs with enhanced connectivity while maintaining sparsity and largely preserving the original graph spectrum, effectively balancing structural bottleneck reduction and graph property preservation. Experimental results validate the effectiveness of our approach, demonstrating its superiority over strong baselines in classification accuracy and retention of the Laplacian spectrum.
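
For intuition, here is a minimal sketch of spectrum-preserving sparsification via effective-resistance sampling, in the spirit of Spielman and Srivastava (2011), a standard way to approximately preserve the Laplacian spectrum while thinning a graph. It illustrates only the general ingredient named in the abstract, not the authors' rewiring method; the function names are hypothetical, and the dense Laplacian pseudoinverse is a simplification practical only for small graphs.

# Sketch: spectrum-preserving sparsification by effective-resistance
# sampling. NOT the paper's method; an illustration of the ingredient
# the abstract refers to.
import numpy as np
import networkx as nx

def effective_resistances(G):
    # Effective resistance of each edge, from the Moore-Penrose
    # pseudoinverse of the graph Laplacian (dense; small graphs only).
    nodes = list(G.nodes())
    idx = {u: i for i, u in enumerate(nodes)}
    L = nx.laplacian_matrix(G, nodelist=nodes).toarray().astype(float)
    Lp = np.linalg.pinv(L)
    return {(u, v): Lp[idx[u], idx[u]] + Lp[idx[v], idx[v]]
                    - 2.0 * Lp[idx[u], idx[v]]
            for u, v in G.edges()}

def spectral_sparsify(G, num_samples, seed=0):
    # Sample edges with probability proportional to effective
    # resistance, reweighting kept edges so the Laplacian is
    # preserved in expectation.
    rng = np.random.default_rng(seed)
    res = effective_resistances(G)
    edges = list(res)
    probs = np.array([res[e] for e in edges])
    probs /= probs.sum()
    H = nx.Graph()
    H.add_nodes_from(G.nodes())
    for k in rng.choice(len(edges), size=num_samples, p=probs):
        u, v = edges[k]
        w = 1.0 / (num_samples * probs[k])  # importance weight
        if H.has_edge(u, v):
            H[u][v]["weight"] += w
        else:
            H.add_edge(u, v, weight=w)
    return H

For example, on nx.karate_club_graph(), spectral_sparsify(G, num_samples=60) returns a reweighted subgraph whose Laplacian eigenvalues closely track the original's. Sampling by effective resistance upweights bridge-like edges, the same structural bottlenecks implicated in over-squashing, so they are almost surely retained, while redundant edges inside dense regions are thinned.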

View on arXiv
@article{liang2025_2506.16110,
  title={Mitigating Over-Squashing in Graph Neural Networks by Spectrum-Preserving Sparsification},
  author={Langzhang Liang and Fanchen Bu and Zixing Song and Zenglin Xu and Shirui Pan and Kijung Shin},
  journal={arXiv preprint arXiv:2506.16110},
  year={2025}
}