
Repetition Makes Perfect: Recurrent Sum-GNNs Match Message Passing Limit

Abstract

We provide the first tight bounds for the expressivity of Recurrent Graph Neural Networks (recurrent GNNs) with finite-precision parameters. We prove that recurrent GNNs, with sum aggregation and ReLU activation, can emulate any graph algorithm that respects the natural message-passing invariance induced by the color refinement (or Weisfeiler-Leman) algorithm. While it is well known that the expressive power of GNNs is limited by this invariance [Morris et al., AAAI 2019; Xu et al., ICLR 2019], we establish that recurrent GNNs can actually reach this limit. This is in contrast to non-recurrent GNNs, which have the power of Weisfeiler-Leman only in a very weak, "non-uniform" sense, where a different GNN model is required for every graph size. The emulation we construct introduces only a polynomial overhead in both time and space. Furthermore, we show that by incorporating random initialization, recurrent GNNs can emulate all graph algorithms, implying in particular that any graph algorithm with polynomial time complexity can be emulated by a recurrent GNN with random initialization, running in polynomial time.
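To make the architecture concrete, the following is a minimal sketch (not the paper's construction) of a recurrent sum-GNN: a single, shared set of weights is applied at every iteration, with sum aggregation over neighbors and a ReLU activation, in contrast to non-recurrent GNNs that use distinct parameters per layer. All names (recurrent_sum_gnn, W_self, W_nbr, b, num_steps) and the NumPy-based setup are illustrative assumptions, not taken from the paper.

import numpy as np

def recurrent_sum_gnn(adj, x, W_self, W_nbr, b, num_steps):
    """Run num_steps iterations of one shared (recurrent) GNN update.

    adj     : (n, n) adjacency matrix of the graph
    x       : (n, d) initial node features
    W_self  : (d, d) weight applied to a node's own state
    W_nbr   : (d, d) weight applied to the sum of neighbor states
    b       : (d,)   bias
    """
    h = x
    for _ in range(num_steps):
        # Sum aggregation: adj @ h sums the states of each node's neighbors.
        msg = adj @ h
        # Recurrent update: the same weights are reused at every step,
        # followed by a ReLU activation.
        h = np.maximum(0.0, h @ W_self + msg @ W_nbr + b)
    return h

# Tiny usage example on a path graph with 3 nodes.
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
d = 4
x = rng.normal(size=(3, d))
W_self = 0.1 * rng.normal(size=(d, d))
W_nbr = 0.1 * rng.normal(size=(d, d))
b = np.zeros(d)
print(recurrent_sum_gnn(adj, x, W_self, W_nbr, b, num_steps=5))

Because the node update only sees the sum of neighbor states, any such network (for any choice of weights and number of steps) cannot distinguish nodes that color refinement assigns the same color; the paper's result is that, with the right weights, recurrent GNNs of this form reach exactly that limit.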

@article{rosenbluth2025_2505.00291,
  title={Repetition Makes Perfect: Recurrent Sum-GNNs Match Message Passing Limit},
  author={Eran Rosenbluth and Martin Grohe},
  journal={arXiv preprint arXiv:2505.00291},
  year={2025}
}