Efficient and Scalable Neural Symbolic Search for Knowledge Graph Complex Query Answering

13 May 2025
Weizhi Fei
Zihao Wang
Hang Yin
Shukai Zhao
Wei Zhang
Yangqiu Song
Abstract

Complex Query Answering (CQA) aims to retrieve answer sets for complex logical formulas from incomplete knowledge graphs, which is a crucial yet challenging task in knowledge graph reasoning. While neuro-symbolic search methods that utilize neural link predictors achieve superior accuracy, they encounter significant complexity bottlenecks: (i) data complexity typically scales quadratically with the number of entities in the knowledge graph, and (ii) query complexity becomes NP-hard for cyclic queries. Consequently, these approaches struggle to scale to larger knowledge graphs and more complex queries. To address these challenges, we propose an efficient and scalable symbolic search framework. First, we propose two constraint strategies to compute neural logical indices that reduce the domains of variables, thereby decreasing the data complexity of symbolic search. Additionally, we introduce an approximate algorithm based on local search to tackle the NP-hard query complexity of cyclic queries. Experiments on various CQA benchmarks demonstrate that our framework reduces the computational load of symbolic methods by 90% while maintaining nearly the same performance, thus alleviating both efficiency and scalability issues.
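The two ideas in the abstract can be pictured with a small sketch: first prune each variable's candidate set with a neural link predictor (a "neural logical index"), then answer a cyclic conjunctive query by greedy local search over the pruned domains. The code below is a minimal illustration only, not the authors' implementation; the scorer, thresholds, entity count, and example query are hypothetical stand-ins.

# Minimal illustrative sketch (not the authors' implementation): prune a
# variable's candidate domain with a neural link predictor, then answer a
# cyclic conjunctive query by greedy local search over the pruned domains.
# The scorer, thresholds, and query below are hypothetical stand-ins.
import random

NUM_ENTITIES = 1000

def link_score(head, relation, tail):
    # Stand-in for a trained neural link predictor; returns a score in [0, 1].
    return random.Random(hash((head, relation, tail))).random()

def prune_domain(anchor_entities, relation, threshold=0.5, top_k=50):
    # "Neural logical index": keep only entities whose best link score from
    # any anchor clears the threshold, capped at the top_k candidates.
    scored = []
    for e in range(NUM_ENTITIES):
        best = max(link_score(a, relation, e) for a in anchor_entities)
        if best >= threshold:
            scored.append((best, e))
    scored.sort(reverse=True)
    return [e for _, e in scored[:top_k]]

def query_score(assignment, edges):
    # Conjunctive (possibly cyclic) query: score is the product of edge scores.
    s = 1.0
    for head_var, relation, tail_var in edges:
        s *= link_score(assignment[head_var], relation, assignment[tail_var])
    return s

def local_search(free_vars, domains, edges, anchors, iters=200):
    # Approximate answering for a cyclic query: start from a random assignment
    # over the pruned domains and greedily re-assign one variable at a time.
    assignment = dict(anchors)
    for v in free_vars:
        assignment[v] = random.choice(domains[v])
    best = query_score(assignment, edges)
    for _ in range(iters):
        v = random.choice(free_vars)
        old = assignment[v]
        assignment[v] = random.choice(domains[v])
        new = query_score(assignment, edges)
        if new >= best:
            best = new
        else:
            assignment[v] = old
    return assignment, best

# Example: a triangle-shaped (cyclic) query with one anchor and two free variables.
anchors = {"a": 7}
edges = [("a", "r1", "x"), ("x", "r2", "y"), ("y", "r3", "a")]
domains = {"x": prune_domain([7], "r1"), "y": prune_domain([7], "r2")}
answer, score = local_search(["x", "y"], domains, edges, anchors)
print(answer, round(score, 3))

Because each free variable ranges over a pruned candidate list rather than the full entity set, the per-step cost no longer grows with the whole knowledge graph, and the local search avoids enumerating every joint assignment of a cyclic query.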

@article{fei2025_2505.08155,
  title={Efficient and Scalable Neural Symbolic Search for Knowledge Graph Complex Query Answering},
  author={Weizhi Fei and Zihao Wang and Hang Yin and Shukai Zhao and Wei Zhang and Yangqiu Song},
  journal={arXiv preprint arXiv:2505.08155},
  year={2025}
}