arXiv: 2310.02572
Improving Knowledge Distillation with Teacher's Explanation
4 October 2023
S. Chowdhury, Ben Liang, A. Tizghadam, Ilijc Albanese
Papers citing "Improving Knowledge Distillation with Teacher's Explanation" (2 of 2 shown):
Distilling Knowledge via Knowledge Review. Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia. 19 Apr 2021.
Knowledge Distillation by On-the-Fly Native Ensemble. Xu Lan, Xiatian Zhu, S. Gong. 12 Jun 2018.