The State of Knowledge Distillation for Classification

20 December 2019
Fabian Ruffy, K. Chahal
ArXiv · PDF · HTML

Papers citing "The State of Knowledge Distillation for Classification"

5 of 5 papers shown
Mutual Distillation Learning Network for Trajectory-User Linking
Wei Chen, Shuzhe Li, Chao Huang, Yanwei Yu, Yongguo Jiang, Junyu Dong
08 May 2022

Self-Distillation from the Last Mini-Batch for Consistency Regularization
Yiqing Shen, Liwu Xu, Yuzhe Yang, Yaqian Li, Yandong Guo
30 Mar 2022

MRI-based Alzheimer's disease prediction via distilling the knowledge in multi-modal data
Hao Guan, Chaoyue Wang, Dacheng Tao
08 Apr 2021

Distilling and Transferring Knowledge via cGAN-generated Samples for Image Classification and Regression
Xin Ding, Z. J. Wang, Zuheng Xu, Z. Jane Wang, William J. Welch
07 Apr 2021

Pruning and Quantization for Deep Neural Network Acceleration: A Survey
Tailin Liang, C. Glossner, Lei Wang, Shaobo Shi, Xiaotong Zhang
24 Jan 2021