ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.
HALoGEN: Fantastic LLM Hallucinations and Where to Find Them

14 January 2025
Abhilasha Ravichander, Shrusti Ghela, David Wadden, Yejin Choi
Topics: HILM, LRM
arXiv: 2501.08292

Papers citing "HALoGEN: Fantastic LLM Hallucinations and Where to Find Them"

3 papers shown

1. Why and How LLMs Hallucinate: Connecting the Dots with Subsequence Associations
   Yiyou Sun, Y. Gai, Lijie Chen, Abhilasha Ravichander, Yejin Choi, Basel Alomair
   Topics: HILM · 17 Apr 2025

2. Exploring Hallucination of Large Multimodal Models in Video Understanding: Benchmark, Analysis and Mitigation
   Hongcheng Gao, Jiashu Qu, Jingyi Tang, Baolong Bi, Yi Liu, Hongyu Chen, Li Liang, Li Su, Qingming Huang
   Topics: MLLM, VLM, LRM · 25 Mar 2025

3. Do Multimodal Large Language Models Understand Welding?
   Grigorii Khvatskii, Yong Suk Lee, Corey Angst, Maria Gibbs, Robert Landers, Nitesh Chawla
   Topics: AI4CE · 18 Mar 2025