Efficient Knowledge Distillation: Empowering Small Language Models with Teacher Model Insights
arXiv: 2409.12586 · 19 September 2024
Authors: Mohamad Ballout, U. Krumnack, Gunther Heidemann, Kai-Uwe Kühnberger
Links: ArXiv (abs) · PDF · HTML
Papers citing "Efficient Knowledge Distillation: Empowering Small Language Models with Teacher Model Insights" (1 / 1 papers shown):
Title: Token-Importance Guided Direct Preference Optimization
Authors: Yang Ning, Lin Hai, Liu Yibo, Tian Baoliang, Liu Guoqing, Zhang Haijun
Date: 26 May 2025