Comprehending Knowledge Graphs with Large Language Models for Recommender Systems

16 October 2024
Ziqiang Cui
Yunpeng Weng
Xing Tang
Fuyuan Lyu
Dugang Liu
Xiuqiang He
Chen Ma
Abstract

In recent years, the introduction of knowledge graphs (KGs) has significantly advanced recommender systems by facilitating the discovery of potential associations between items. However, existing methods still face several limitations. First, most KGs suffer from missing facts or limited scope. Second, existing methods convert textual information in KGs into IDs, resulting in the loss of natural semantic connections between different items. Third, existing methods struggle to capture high-order connections in the global KG. To address these limitations, we propose a novel method called CoLaKG, which leverages large language models (LLMs) to improve KG-based recommendations. The extensive world knowledge and remarkable reasoning capabilities of LLMs enable our method to supplement missing facts in KGs, and their powerful text understanding abilities allow for better utilization of semantic information. Specifically, CoLaKG extracts useful information from the KG at both local and global levels. By employing item-centered subgraph extraction and prompt engineering, it accurately captures the local KG. Subsequently, through retrieval-based neighbor enhancement, it supplements the current item with related items from the entire KG, thereby utilizing global information. The local and global information extracted by the LLM is integrated into the recommendation model through a representation fusion module and a retrieval-augmented representation learning module, respectively, improving recommendation performance. Extensive experiments on four real-world datasets demonstrate the superiority of our method.
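The abstract describes a three-part pipeline: prompting an LLM with item-centered subgraphs, retrieval-based neighbor enhancement over the whole KG, and fusing the resulting representations. The Python sketch below illustrates the general idea only; the function names, the cosine-similarity retrieval, and the averaging-based fusion are assumptions made for illustration and are not taken from the paper's implementation.

    import numpy as np

    def item_subgraph_prompt(item, triples):
        # Item-centered subgraph extraction: keep 1-hop triples touching the item
        # and serialize them into a natural-language prompt for the LLM.
        neighbors = [(h, r, t) for (h, r, t) in triples if h == item or t == item]
        facts = "; ".join(f"{h} --{r}--> {t}" for h, r, t in neighbors)
        return (
            f"Item: {item}\n"
            f"Known facts: {facts}\n"
            "Summarize this item and infer any missing attributes."
        )

    def retrieve_neighbors(item_emb, all_embs, k=3):
        # Retrieval-based neighbor enhancement (assumed here to be cosine
        # similarity over LLM-derived item embeddings): return the k most
        # similar items from the entire KG.
        norms = np.linalg.norm(all_embs, axis=1) * np.linalg.norm(item_emb)
        sims = (all_embs @ item_emb) / np.clip(norms, 1e-9, None)
        return np.argsort(-sims)[:k]

    def fuse(id_emb, llm_emb, neighbor_embs, alpha=0.5):
        # Representation fusion (illustrative weighted average): combine the ID
        # embedding, the LLM-based local embedding, and the mean of the
        # retrieved neighbors' embeddings.
        global_emb = neighbor_embs.mean(axis=0)
        return alpha * id_emb + (1 - alpha) * 0.5 * (llm_emb + global_emb)

    # Toy usage with random 4-dimensional embeddings for three items.
    rng = np.random.default_rng(0)
    embs = rng.normal(size=(3, 4))
    print(item_subgraph_prompt("movie_1", [("movie_1", "directed_by", "director_7")]))
    print(retrieve_neighbors(embs[0], embs, k=2))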

@article{cui2025_2410.12229,
  title={Comprehending Knowledge Graphs with Large Language Models for Recommender Systems},
  author={Ziqiang Cui and Yunpeng Weng and Xing Tang and Fuyuan Lyu and Dugang Liu and Xiuqiang He and Chen Ma},
  journal={arXiv preprint arXiv:2410.12229},
  year={2025}
}