ResearchTrend.AI

THaMES: An End-to-End Tool for Hallucination Mitigation and Evaluation in Large Language Models
arXiv:2409.11353

17 September 2024
Mengfei Liang
Archish Arun
Zekun Wu
Cristian Muñoz
Jonathan Lutch
Emre Kazim
Adriano Soares Koshiyama
Philip C. Treleaven
    HILM

Papers citing "THaMES: An End-to-End Tool for Hallucination Mitigation and Evaluation in Large Language Models"

1 / 1 papers shown
The Factual Inconsistency Problem in Abstractive Text Summarization: A Survey
Yi-Chong Huang
Xiachong Feng
Xiaocheng Feng
Bing Qin
HILM
30 Apr 2021