Causal-LLaVA: Causal Disentanglement for Mitigating Hallucination in Multimodal Large Language Models

26 May 2025
Xinmiao Hu, C. Wang, Ruihe An, ChenYu Shao, Xiaojun Ye, Sheng Zhou, Liangcheng Li
MLLM · LRM
ArXiv (abs) · PDF · HTML

Papers citing "Causal-LLaVA: Causal Disentanglement for Mitigating Hallucination in Multimodal Large Language Models"

1 / 1 papers shown
Hallucination of Multimodal Large Language Models: A Survey
Zechen Bai, Pichao Wang, Tianjun Xiao, Tong He, Zongbo Han, Zheng Zhang, Mike Zheng Shou
VLM · LRM
29 Apr 2024