Fast-and-Frugal Text-Graph Transformers are Effective Link Predictors

13 August 2024
Andrei Catalin Coman
Christos Theodoropoulos
Marie-Francine Moens
James Henderson
arXiv: 2408.06778 (abs · PDF · HTML)
Main: 8 pages · 2 figures · 6 tables · Bibliography: 5 pages · Appendix: 1 page
Abstract

We propose Fast-and-Frugal Text-Graph (FnF-TG) Transformers, a Transformer-based framework that unifies textual and structural information for inductive link prediction in text-attributed knowledge graphs. We demonstrate that, by effectively encoding ego-graphs (1-hop neighbourhoods), we can reduce the reliance on resource-intensive textual encoders. This makes the model fast at both training and inference time, and frugal in terms of cost. We perform a comprehensive evaluation on three popular datasets and show that FnF-TG achieves superior performance compared to previous state-of-the-art methods. We also extend inductive learning to a fully inductive setting, where relations do not rely on transductive (fixed) representations, as in previous work, but are instead a function of their textual description. Additionally, we introduce new variants of existing datasets, specifically designed to test the performance of models on unseen relations at inference time, thus offering a new test bench for fully inductive link prediction.
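
For illustration, here is a minimal PyTorch sketch of the ego-graph encoding idea described in the abstract. It is not the authors' implementation: the module name EgoGraphEncoder, the DistMult-style scoring function, and all dimensions are assumptions, and random tensors stand in for the output of a textual encoder (which, per the abstract, would be run sparingly to keep the model frugal).

# A minimal sketch of the FnF-TG idea from the abstract, not the authors'
# implementation. Module names, dimensions, and the scoring function are
# illustrative assumptions.
import torch
import torch.nn as nn

class EgoGraphEncoder(nn.Module):
    """Fuses a centre entity's text embedding with its 1-hop neighbourhood
    (neighbour-entity plus relation text embeddings) via a small Transformer."""

    def __init__(self, dim: int = 128, heads: int = 4, layers: int = 2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=layers)

    def forward(self, centre, neigh_ents, neigh_rels):
        # centre:     (B, D)    text embedding of the centre entity
        # neigh_ents: (B, N, D) text embeddings of 1-hop neighbour entities
        # neigh_rels: (B, N, D) text embeddings of the connecting relations
        tokens = torch.cat([centre.unsqueeze(1), neigh_ents + neigh_rels], dim=1)
        out = self.encoder(tokens)  # (B, 1 + N, D)
        return out[:, 0]  # contextualised centre-entity representation

def score_triple(head, rel, tail):
    # DistMult-style scoring, one common choice; the paper may use another.
    return (head * rel * tail).sum(-1)

# Toy usage with random embeddings standing in for a textual encoder's outputs.
B, N, D = 2, 5, 128
enc = EgoGraphEncoder(dim=D)
head = enc(torch.randn(B, D), torch.randn(B, N, D), torch.randn(B, N, D))
tail = enc(torch.randn(B, D), torch.randn(B, N, D), torch.randn(B, N, D))
rel = torch.randn(B, D)  # relation as a function of its textual description
print(score_triple(head, rel, tail).shape)  # torch.Size([2])

In this sketch, the relation embedding is itself derived from relation text, which mirrors the fully inductive setting the abstract describes: unseen relations at inference time need no fixed, transductive representation.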

@article{coman2025_2408.06778,
  title={Fast-and-Frugal Text-Graph Transformers are Effective Link Predictors},
  author={Andrei C. Coman and Christos Theodoropoulos and Marie-Francine Moens and James Henderson},
  journal={arXiv preprint arXiv:2408.06778},
  year={2025}
}