Accurate Knowledge Distillation with n-best Reranking
Hendra Setiawan
arXiv:2305.12057, 20 May 2023
Papers citing "Accurate Knowledge Distillation with n-best Reranking" (4 of 4 shown):

1. KETCHUP: K-Step Return Estimation for Sequential Knowledge Distillation
   Jiabin Fan, Guoqing Luo, Michael Bowling, Lili Mou
   26 Apr 2025

2. Just KIDDIN: Knowledge Infusion and Distillation for Detection of INdecent Memes
   Rahul Garg, Trilok Padhi, Hemang Jain, Ugur Kursuncu, Ponnurangam Kumaraguru
   19 Nov 2024

3. MBR and QE Finetuning: Training-time Distillation of the Best and Most Expensive Decoding Methods
   M. Finkelstein, Subhajit Naskar, Mehdi Mirzazadeh, Apurva Shah, Markus Freitag
   19 Sep 2023

4. Facebook AI WMT21 News Translation Task Submission
   C. Tran, Shruti Bhosale, James Cross, Philipp Koehn, Sergey Edunov, Angela Fan
   06 Aug 2021