Cayley Graph Propagation

Main: 9 pages; Bibliography: 4 pages; Appendix: 7 pages; 8 figures; 12 tables
Abstract

In spite of the plethora of success stories with graph neural networks (GNNs) on modelling graph-structured data, they are notoriously vulnerable to over-squashing, whereby tasks necessitate the mixing of information between distant pairs of nodes. To address this problem, prior work suggests rewiring the graph structure to improve information flow. Alternatively, a significant body of research has dedicated itself to discovering and precomputing bottleneck-free graph structures to ameliorate over-squashing. One well-regarded family of bottleneck-free graphs within the mathematical community are expander graphs, and prior work, Expander Graph Propagation (EGP), proposes using a well-known expander graph family, the Cayley graphs of the special linear group $\mathrm{SL}(2,\mathbb{Z}_n)$, as a computational template for GNNs. However, in EGP the computational graphs used are truncated to align with a given input graph. In this work, we show that truncation is detrimental to the coveted expansion properties. Instead, we propose Cayley Graph Propagation (CGP), a method that propagates information over the complete Cayley graph structure, thereby ensuring it is bottleneck-free to better alleviate over-squashing. Our empirical evidence across several real-world datasets shows not only that CGP recovers significant improvements over EGP, but also that it matches or outperforms computationally complex graph rewiring techniques.
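As an illustration of the computational template mentioned above (not code from the paper), the sketch below constructs the Cayley graph of $\mathrm{SL}(2,\mathbb{Z}_n)$ by breadth-first search from the identity over the standard generator set {A, A^-1, B, B^-1}; the function name cayley_graph_sl2 and the use of NumPy are assumptions made for this example.

import numpy as np
from collections import deque

def cayley_graph_sl2(n):
    # Standard generators of SL(2, Z_n) and their inverses, entries reduced mod n.
    A     = np.array([[1, 1], [0, 1]]) % n
    A_inv = np.array([[1, -1], [0, 1]]) % n
    B     = np.array([[1, 0], [1, 1]]) % n
    B_inv = np.array([[1, 0], [-1, 1]]) % n
    gens = [A, A_inv, B, B_inv]

    def key(M):
        # Hashable label for a 2x2 matrix over Z_n.
        return tuple((M % n).flatten())

    start = np.eye(2, dtype=int)
    ids = {key(start): 0}          # matrix label -> node id
    queue = deque([start])
    edges = set()

    # Breadth-first exploration of the group: nodes are group elements,
    # and each element g is connected to g*s for every generator s.
    while queue:
        M = queue.popleft()
        u = ids[key(M)]
        for G in gens:
            N = (M @ G) % n
            k = key(N)
            if k not in ids:
                ids[k] = len(ids)
                queue.append(N)
            v = ids[k]
            if u != v:
                edges.add((min(u, v), max(u, v)))
    return len(ids), sorted(edges)

# For n = 5: |SL(2, Z_5)| = 120 nodes, each of degree 4 (240 undirected edges).
num_nodes, edge_list = cayley_graph_sl2(5)
print(num_nodes, len(edge_list))

The complete graph produced this way is the 4-regular expander structure; CGP, as described in the abstract, propagates over this complete structure rather than truncating it to the size of the input graph.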

@article{wilson2025_2410.03424,
  title={Cayley Graph Propagation},
  author={JJ Wilson and Maya Bechler-Speicher and Petar Veličković},
  journal={arXiv preprint arXiv:2410.03424},
  year={2025}
}