Accurate Knowledge Distillation with n-best Reranking

20 May 2023
Hendra Setiawan

Papers citing "Accurate Knowledge Distillation with n-best Reranking"

4 papers shown

KETCHUP: K-Step Return Estimation for Sequential Knowledge Distillation
Jiabin Fan, Guoqing Luo, Michael Bowling, Lili Mou · OffRL · 26 Apr 2025

Just KIDDIN: Knowledge Infusion and Distillation for Detection of INdecent Memes
Rahul Garg, Trilok Padhi, Hemang Jain, Ugur Kursuncu, Ponnurangam Kumaraguru · 19 Nov 2024

MBR and QE Finetuning: Training-time Distillation of the Best and Most Expensive Decoding Methods
M. Finkelstein, Subhajit Naskar, Mehdi Mirzazadeh, Apurva Shah, Markus Freitag · 19 Sep 2023

Facebook AI WMT21 News Translation Task Submission
C. Tran, Shruti Bhosale, James Cross, Philipp Koehn, Sergey Edunov, Angela Fan · VLM · 06 Aug 2021