FactAlign: Long-form Factuality Alignment of Large Language Models
Chao-Wei Huang, Yun-Nung Chen
2 October 2024 · arXiv:2410.01691
Topics: HILM
Papers citing "FactAlign: Long-form Factuality Alignment of Large Language Models" (2 of 2 shown):
1. Atomic Consistency Preference Optimization for Long-Form Question Answering
   Jingfeng Chen, Raghuveer Thirukovalluru, Junlin Wang, Kaiwei Luo, Bhuwan Dhingra
   Topics: KELM, HILM · 14 May 2025
2. The FACTS Grounding Leaderboard: Benchmarking LLMs' Ability to Ground Responses to Long-Form Input
   Alon Jacovi, Andrew Wang, Chris Alberti, Connie Tao, Jon Lipovetz, ..., Rachana Fellinger, Rui Wang, Zizhao Zhang, Sasha Goldshtein, Dipanjan Das
   Topics: HILM, ALM · 06 Jan 2025