ResearchTrend.AI
HILL: A Hallucination Identifier for Large Language Models

11 March 2024
Florian Leiser, S. Eckhardt, Valentin Leuthe, Merlin Knaeble, Alexander Maedche, Gerhard Schwabe, Ali Sunyaev

Papers citing "HILL: A Hallucination Identifier for Large Language Models"

6 papers
Hallucinations in Large Multilingual Translation Models
Nuno M. Guerreiro, Duarte M. Alves, Jonas Waldendorf, Barry Haddow, Alexandra Birch, Pierre Colombo, André F.T. Martins
28 Mar 2023

Appropriate Reliance on AI Advice: Conceptualization and the Effect of Explanations
Max Schemmer, Niklas Kühl, Carina Benz, Andrea Bartos, G. Satzger
04 Feb 2023

Towards a Science of Human-AI Decision Making: A Survey of Empirical Studies
Vivian Lai, Chacha Chen, Q. V. Liao, Alison Smith-Renner, Chenhao Tan
21 Dec 2021

The Curious Case of Hallucinations in Neural Machine Translation
Vikas Raunak, Arul Menezes, Marcin Junczys-Dowmunt
14 Apr 2021

Human-Centered Artificial Intelligence: Reliable, Safe & Trustworthy
B. Shneiderman
10 Feb 2020

Know What You Don't Know: Unanswerable Questions for SQuAD
Pranav Rajpurkar, Robin Jia, Percy Liang
11 Jun 2018