ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

Real-Time Detection of Hallucinated Entities in Long-Form Generation (arXiv:2509.03531)

26 August 2025
Oscar Obeso, Andy Arditi, Javier Ferrando, Joshua Freeman, Cameron Holmes, Neel Nanda
    HILM
Papers citing "Real-Time Detection of Hallucinated Entities in Long-Form Generation"

4 citing papers shown.
Grounding or Guessing? Visual Signals for Detecting Hallucinations in Sign Language Translation
Yasser Hamidullah, Koel Dutta Chowdury, Yusser Al Ghussin, Shakib Yazdani, Cennet Oguz, Josef van Genabith, C. España-Bonet
21 Oct 2025
Reference-Free Rating of LLM Responses via Latent Information
Leander Girrbach, Chi-Ping Su, Tankred Saanum, Richard Socher, Eric Schulz, Zeynep Akata
29 Sep 2025
Hallucination Reduction with CASAL: Contrastive Activation Steering for Amortized Learning
Wannan Yang, Xinchi Qiu, L. Yu, Yuchen Zhang, Oliver Aobo Yang, Narine Kokhlikyan, Nicola Cancedda, Diego Garcia-Olano
25 Sep 2025
Steering MoE LLMs via Expert (De)Activation
Mohsen Fayyaz, Ali Modarressi, Hanieh Deilamsalehy, Franck Dernoncourt, Ryan Rossi, Trung Bui, Hinrich Schutze, Nanyun Peng
    MoE, LLMSV
11 Sep 2025