
GPR: Empowering Generation with Graph-Pretrained Retriever

Main: 3 pages · Bibliography: 2 pages · Appendix: 2 pages · 2 figures · 2 tables
Abstract

Graph retrieval-augmented generation (GRAG) places high demands on graph-specific retrievers. However, existing retrievers often rely on language models pretrained on plain text, limiting their effectiveness due to domain misalignment and a lack of structural awareness. To address these challenges, we propose GPR, a graph-based retriever pretrained directly on knowledge graphs. GPR aligns natural language questions with relevant subgraphs through LLM-guided graph augmentation and employs a structure-aware objective to learn fine-grained retrieval strategies. Experiments across two datasets, three LLM backbones, and five baselines show that GPR consistently improves both retrieval quality and downstream generation, demonstrating its effectiveness as a robust retrieval solution for GRAG.

@article{wang2025_2506.00261,
  title={GPR: Empowering Generation with Graph-Pretrained Retriever},
  author={Xiaochen Wang and Zongyu Wu and Yuan Zhong and Xiang Zhang and Suhang Wang and Fenglong Ma},
  journal={arXiv preprint arXiv:2506.00261},
  year={2025}
}