Hallucination Detection in LLMs: Fast and Memory-Efficient Finetuned Models
arXiv: 2409.02976
4 September 2024
Gabriel Y. Arteaga, Thomas B. Schön, Nicolas Pielawski
Papers citing "Hallucination Detection in LLMs: Fast and Memory-Efficient Finetuned Models" (3 of 3 papers shown)
Uncertainty Distillation: Teaching Language Models to Express Semantic Confidence
Sophia Hager, David Mueller, Kevin Duh, Nicholas Andrews
76 · 0 · 0
18 Mar 2025

Uncertainty quantification in fine-tuned LLMs using LoRA ensembles
Oleksandr Balabanov, Hampus Linander
UQCV
60 · 15 · 0
19 Feb 2024

Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles
Balaji Lakshminarayanan, Alexander Pritzel, Charles Blundell
UQCV, BDL
295 · 5,726 · 0
05 Dec 2016