Towards Graph Foundation Models: Training on Knowledge Graphs Enables Transferability to General Graphs

Abstract

Inspired by the success of large language models, there is a trend toward developing graph foundation models that can handle diverse downstream tasks across various domains. However, current models often require extra fine-tuning to apply their learned structural and semantic representations to new graphs, which limits their versatility. Recent breakthroughs in zero-shot inductive reasoning on knowledge graphs (KGs) offer a new perspective on extending KG reasoning to general graph applications. In this paper, we introduce SCR, a unified graph reasoning framework designed to train on knowledge graphs and generalize effectively across a wide range of graph tasks and domains. We begin by designing task-specific KG structures to establish a unified topology for different task formats. We then propose semantic-conditioned message passing, a novel mechanism that addresses the inherent semantic isolation of traditional KG reasoning by jointly modeling structural and semantic invariance patterns in graph representations. To demonstrate its effectiveness, we evaluate the inductive reasoning capability of SCR on 38 diverse graph datasets covering node-level, link-level, and graph-level tasks across multiple domains. Our results show substantial performance gains over existing foundation models and supervised baselines, highlighting the efficacy and adaptability of our approach.
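The abstract does not detail the mechanism, but a rough illustration of what "semantic-conditioned message passing" might look like is sketched below in PyTorch: each structural message is gated by a semantic embedding of the edge it travels along, so structure and semantics are modeled jointly rather than in isolation. All names (SemanticConditionedMP, condition, update) and the sigmoid-gating design are hypothetical simplifications, not the authors' implementation.

import torch
import torch.nn as nn

class SemanticConditionedMP(nn.Module):
    """Minimal sketch of one semantic-conditioned message-passing layer.

    Assumption: edge semantics (e.g., text embeddings of relations or
    features) modulate structural messages via a learned gate.
    """

    def __init__(self, hidden_dim: int, semantic_dim: int):
        super().__init__()
        # Projects edge semantics into a per-message gate (assumed design).
        self.condition = nn.Linear(semantic_dim, hidden_dim)
        # Combines a node's previous state with its aggregated messages.
        self.update = nn.Linear(2 * hidden_dim, hidden_dim)

    def forward(self, node_states, edge_index, edge_semantics):
        # node_states:    [num_nodes, hidden_dim]
        # edge_index:     [2, num_edges] rows of (source, target) indices
        # edge_semantics: [num_edges, semantic_dim] per-edge embeddings
        src, dst = edge_index
        # Condition each structural message on its edge's semantics.
        gate = torch.sigmoid(self.condition(edge_semantics))
        messages = node_states[src] * gate
        # Sum-aggregate incoming messages per target node.
        aggregated = torch.zeros_like(node_states)
        aggregated.index_add_(0, dst, messages)
        # Update node states from [previous state; aggregated messages].
        return torch.relu(self.update(torch.cat([node_states, aggregated], dim=-1)))

Stacking several such layers would propagate semantics-aware structural information across multi-hop neighborhoods; the actual SCR architecture should be taken from the paper itself.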

@article{wang2025_2410.12609,
  title={Towards Graph Foundation Models: Training on Knowledge Graphs Enables Transferability to General Graphs},
  author={Kai Wang and Siqiang Luo and Caihua Shan and Yifei Shen},
  journal={arXiv preprint arXiv:2410.12609},
  year={2025}
}