Fast-and-Frugal Text-Graph Transformers are Effective Link Predictors

13 August 2024
Andrei Catalin Coman
Christos Theodoropoulos
Marie-Francine Moens
James Henderson
arXiv (abs) · PDF · HTML
Main: 8 pages · 2 figures · 6 tables · Bibliography: 5 pages · Appendix: 1 page
Abstract

Link prediction models can benefit from incorporating textual descriptions of entities and relations, enabling fully inductive learning and flexibility in dynamic graphs. We address the challenge of also capturing rich structured information about the local neighbourhood of entities and their relations, by introducing a Transformer-based approach that effectively integrates textual descriptions with graph structure, reducing the reliance on resource-intensive text encoders. Our experiments on three challenging datasets show that our Fast-and-Frugal Text-Graph (FnF-TG) Transformers achieve superior performance compared to the previous state-of-the-art methods, while maintaining efficiency and scalability.
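As a rough illustration of the idea summarised in the abstract (not the authors' actual FnF-TG implementation), the sketch below builds an entity representation by running a small Transformer over a lightweight embedding of the entity's textual description together with embeddings of its neighbouring (relation, entity) descriptions, then scores a (head, relation, tail) triple. The dimensions, the hashing-based text embedding, and the translational scoring function are all assumptions made only for illustration.

# Illustrative sketch only: a tiny Transformer-based link predictor that fuses
# textual descriptions with local-neighbourhood information. It is NOT the
# paper's architecture; names, sizes, and the score function are assumptions.

import torch
import torch.nn as nn

DIM = 64  # hidden size (assumed)

def embed_text(text: str, table: nn.EmbeddingBag) -> torch.Tensor:
    """Cheap stand-in for a text encoder: mean-pool hashed whitespace tokens."""
    ids = torch.tensor([hash(tok) % table.num_embeddings for tok in text.split()])
    return table(ids.unsqueeze(0)).squeeze(0)  # shape: (DIM,)

class TextGraphLinkPredictor(nn.Module):
    def __init__(self, dim: int = DIM, vocab: int = 10_000):
        super().__init__()
        # Lightweight ("frugal") text embedding instead of a large pretrained encoder.
        self.token_table = nn.EmbeddingBag(vocab, dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def encode_entity(self, desc: str, neighbour_descs: list[str]) -> torch.Tensor:
        # Sequence = [entity description] + one vector per neighbouring
        # (relation, entity) description, so the entity token can attend to
        # its local graph neighbourhood.
        tokens = [embed_text(desc, self.token_table)]
        tokens += [embed_text(n, self.token_table) for n in neighbour_descs]
        seq = torch.stack(tokens).unsqueeze(0)  # (1, seq_len, dim)
        return self.encoder(seq)[0, 0]          # contextualised entity vector

    def score(self, head: torch.Tensor, rel_desc: str, tail: torch.Tensor) -> torch.Tensor:
        # Simple translational-style score; the paper may use a different one.
        rel = embed_text(rel_desc, self.token_table)
        return -(head + rel - tail).norm()

model = TextGraphLinkPredictor()
h = model.encode_entity("Marie Curie, physicist and chemist",
                        ["won: Nobel Prize in Physics", "born in: Warsaw"])
t = model.encode_entity("Nobel Prize in Chemistry, international award",
                        ["awarded by: Royal Swedish Academy of Sciences"])
print(model.score(h, "won", t).item())

Because entities and relations are represented through their textual descriptions rather than fixed embedding tables, a model of this shape can in principle score triples over entities unseen at training time, which is the fully inductive setting the abstract refers to.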

View on arXiv
@article{coman2025_2408.06778,
  title={Fast-and-Frugal Text-Graph Transformers are Effective Link Predictors},
  author={Andrei C. Coman and Christos Theodoropoulos and Marie-Francine Moens and James Henderson},
  journal={arXiv preprint arXiv:2408.06778},
  year={2025}
}