Knowledge Distillation Beyond Model Compression
arXiv:2007.01922
3 July 2020
F. Sarfraz, Elahe Arani, Bahram Zonooz
Papers citing "Knowledge Distillation Beyond Model Compression" (5 papers shown):
Rotation Invariant Quantization for Model Compression [MQ]
Dor-Joseph Kampeas, Yury Nahshan, Hanoch Kremer, Gil Lederman, Shira Zaloshinski, Zheng Li, E. Haleva
03 Mar 2023

LILA-BOTI: Leveraging Isolated Letter Accumulations By Ordering Teacher Insights for Bangla Handwriting Recognition
Md. Ismail Hossain, Mohammed Rakib, Sabbir Mollah, Fuad Rahman, Nabeel Mohammed
23 May 2022

Self-Distillation from the Last Mini-Batch for Consistency Regularization
Yiqing Shen, Liwu Xu, Yuzhe Yang, Yaqian Li, Yandong Guo
30 Mar 2022

Distill on the Go: Online knowledge distillation in self-supervised learning [SSL]
Prashant Shivaram Bhat, Elahe Arani, Bahram Zonooz
20 Apr 2021

Knowledge Distillation by On-the-Fly Native Ensemble
Xu Lan, Xiatian Zhu, S. Gong
12 Jun 2018