An Empirical Analysis of the Impact of Data Augmentation on Knowledge Distillation
Deepan Das, Haley Massa, Abhimanyu Kulkarni, Theodoros Rekatsinas
arXiv:2006.03810 · 6 June 2020
Papers citing "An Empirical Analysis of the Impact of Data Augmentation on Knowledge Distillation" (5 / 5 papers shown)
Robustness-Reinforced Knowledge Distillation with Correlation Distance and Network Pruning
Seonghak Kim, Gyeongdo Ham, Yucheol Cho, Daeshik Kim
23 Nov 2023

Distilling Calibrated Student from an Uncalibrated Teacher
Ishan Mishra, Sethu Vamsi Krishna, Deepak Mishra
FedML · 22 Feb 2023

Curriculum Temperature for Knowledge Distillation
Zheng Li, Xiang Li, Lingfeng Yang, Borui Zhao, Renjie Song, Lei Luo, Jun Yu Li, Jian Yang
29 Nov 2022

Meta Knowledge Distillation
Jihao Liu, Boxiao Liu, Hongsheng Li, Yu Liu
16 Feb 2022

Isotonic Data Augmentation for Knowledge Distillation
Wanyun Cui, Sen Yan
03 Jul 2021