The State of Knowledge Distillation for Classification
arXiv: 1912.10850
20 December 2019
Fabian Ruffy, K. Chahal
Papers citing "The State of Knowledge Distillation for Classification" (5 of 5 shown)
Mutual Distillation Learning Network for Trajectory-User Linking
Wei Chen, Shuzhe Li, Chao Huang, Yanwei Yu, Yongguo Jiang, Junyu Dong
08 May 2022
Self-Distillation from the Last Mini-Batch for Consistency Regularization
Yiqing Shen, Liwu Xu, Yuzhe Yang, Yaqian Li, Yandong Guo
30 Mar 2022
MRI-based Alzheimer's disease prediction via distilling the knowledge in multi-modal data
Hao Guan, Chaoyue Wang, Dacheng Tao
08 Apr 2021
Distilling and Transferring Knowledge via cGAN-generated Samples for Image Classification and Regression
Xin Ding, Z. J. Wang, Zuheng Xu, Z. Jane Wang, William J. Welch
07 Apr 2021
Pruning and Quantization for Deep Neural Network Acceleration: A Survey
Tailin Liang, C. Glossner, Lei Wang, Shaobo Shi, Xiaotong Zhang
24 Jan 2021