Think before You Simulate: Symbolic Reasoning to Orchestrate Neural Computation for Counterfactual Question Answering

12 June 2025
Adam Ishay, Zhun Yang, Joohyung Lee, Ilgu Kang, Dongjae Lim
arXiv (abs) · PDF · HTML
Main: 8 pages · Bibliography: 2 pages · Appendix: 7 pages · 10 figures · 9 tables
Abstract

Causal and temporal reasoning about video dynamics is a challenging problem. While neuro-symbolic models that combine symbolic reasoning with neural-based perception and prediction have shown promise, they exhibit limitations, especially in answering counterfactual questions. This paper introduces a method to enhance a neuro-symbolic model for counterfactual reasoning, leveraging symbolic reasoning about causal relations among events. We define the notion of a causal graph to represent such relations and use Answer Set Programming (ASP), a declarative logic programming method, to determine how to coordinate perception and simulation modules. We validate the effectiveness of our approach on two benchmarks, CLEVRER and CRAFT. Our enhancement achieves state-of-the-art performance on the CLEVRER challenge, significantly outperforming existing models. For the CRAFT benchmark, we leverage large pre-trained language models, such as GPT-3.5 and GPT-4, as a proxy for a dynamics simulator. Our findings show that this method further improves performance on counterfactual questions by providing alternative prompts guided by symbolic causal reasoning.
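To make the orchestration idea concrete, the sketch below shows one way a causal graph over detected events could decide which events must be re-simulated under a counterfactual intervention such as removing an object. It is only an illustration under assumed definitions: the Event structure, the shared-object notion of causal dependence, and the functions build_causal_graph and affected_events are hypothetical, and the paper encodes this reasoning in Answer Set Programming rather than Python.

# Minimal sketch of causal-graph reasoning for counterfactual QA.
# Hypothetical structures: the paper uses Answer Set Programming (ASP),
# and its actual event and causal-graph definitions may differ.
from dataclasses import dataclass


@dataclass(frozen=True)
class Event:
    kind: str            # e.g. "collision", "enter", "exit"
    objects: frozenset   # ids of the objects participating in the event
    frame: int           # time of the event in the observed video


def build_causal_graph(events):
    """Link each event to later events that share a participating object.

    This is one simple notion of causal dependence between events; the
    paper's causal graph is defined more carefully.
    """
    edges = {e: set() for e in events}
    for cause in events:
        for effect in events:
            if effect.frame > cause.frame and cause.objects & effect.objects:
                edges[cause].add(effect)
    return edges


def affected_events(events, edges, removed_object):
    """Events that cannot be trusted once `removed_object` is deleted.

    Directly affected events involve the removed object; indirectly affected
    ones are reachable from them in the causal graph. Everything else can be
    read off the original perception output, and only the affected part needs
    to be handed to the (neural or LLM-based) dynamics simulator.
    """
    frontier = [e for e in events if removed_object in e.objects]
    affected = set(frontier)
    while frontier:
        event = frontier.pop()
        for nxt in edges[event]:
            if nxt not in affected:
                affected.add(nxt)
                frontier.append(nxt)
    return affected


# Toy usage: removing the cube invalidates the first collision and everything
# causally downstream of it, but not the cone's earlier entry event.
events = [
    Event("enter", frozenset({"cone"}), frame=10),
    Event("collision", frozenset({"cube", "cylinder"}), frame=30),
    Event("collision", frozenset({"cylinder", "sphere"}), frame=60),
    Event("collision", frozenset({"sphere", "cone"}), frame=90),
]
graph = build_causal_graph(events)
print(sorted((e.kind, e.frame) for e in affected_events(events, graph, "cube")))
# -> [('collision', 30), ('collision', 60), ('collision', 90)]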

View on arXiv
@article{ishay2025_2506.10753,
  title={Think before You Simulate: Symbolic Reasoning to Orchestrate Neural Computation for Counterfactual Question Answering},
  author={Adam Ishay and Zhun Yang and Joohyung Lee and Ilgu Kang and Dongjae Lim},
  journal={arXiv preprint arXiv:2506.10753},
  year={2025}
}