A Comprehensive Overhaul of Feature Distillation
arXiv:1904.01866 (3 April 2019)
Byeongho Heo, Jeesoo Kim, Sangdoo Yun, Hyojin Park, Nojun Kwak, J. Choi
Papers citing "A Comprehensive Overhaul of Feature Distillation" (25 of 125 shown):
- Students are the Best Teacher: Exit-Ensemble Distillation with Multi-Exits. Hojung Lee, Jong-Seok Lee. 01 Apr 2021.
- Distilling Object Detectors via Decoupled Features. Jianyuan Guo, Kai Han, Yunhe Wang, Han Wu, Xinghao Chen, Chunjing Xu, Chang Xu. 26 Mar 2021.
- Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation. Mingi Ji, Seungjae Shin, Seunghyun Hwang, Gibeom Park, Il-Chul Moon. 15 Mar 2021.
- Re-labeling ImageNet: from Single to Multi-Labels, from Global to Localized Labels. Sangdoo Yun, Seong Joon Oh, Byeongho Heo, Dongyoon Han, Junsuk Choe, Sanghyuk Chun. 13 Jan 2021.
- Computation-Efficient Knowledge Distillation via Uncertainty-Aware Mixup. Guodong Xu, Ziwei Liu, Chen Change Loy. 17 Dec 2020. [UQCV]
- Cross-Layer Distillation with Semantic Calibration. Defang Chen, Jian-Ping Mei, Yuan Zhang, Can Wang, Yan Feng, Chun-Yen Chen. 06 Dec 2020. [FedML]
- Federated Knowledge Distillation. Hyowoon Seo, Jihong Park, Seungeun Oh, M. Bennis, Seong-Lyun Kim. 04 Nov 2020. [FedML]
- Comprehensive Online Network Pruning via Learnable Scaling Factors. Muhammad Umair Haider, M. Taj. 06 Oct 2020.
- Kernel Based Progressive Distillation for Adder Neural Networks. Yixing Xu, Chang Xu, Xinghao Chen, Wei Zhang, Chunjing Xu, Yunhe Wang. 28 Sep 2020.
- Differentiable Feature Aggregation Search for Knowledge Distillation. Yushuo Guan, Pengyu Zhao, Bingxuan Wang, Yuanxing Zhang, Cong Yao, Kaigui Bian, Jian Tang. 02 Aug 2020. [FedML]
- Distilling Visual Priors from Self-Supervised Learning. Bingchen Zhao, Xin Wen. 01 Aug 2020. [SSL]
- Learning with Privileged Information for Efficient Image Super-Resolution. Wonkyung Lee, Junghyup Lee, Dohyung Kim, Bumsub Ham. 15 Jul 2020.
- Unsupervised Multi-Target Domain Adaptation Through Knowledge Distillation. Le Thanh Nguyen-Meidine, Atif Belal, M. Kiran, Jose Dolz, Louis-Antoine Blais-Morin, Eric Granger. 14 Jul 2020.
- Optical Flow Distillation: Towards Efficient and Stable Video Style Transfer. Xinghao Chen, Yiman Zhang, Yunhe Wang, Han Shu, Chunjing Xu, Chang Xu. 10 Jul 2020. [VGen]
- Dynamic Group Convolution for Accelerating Convolutional Neural Networks. Z. Su, Linpu Fang, Wenxiong Kang, D. Hu, M. Pietikäinen, Li Liu. 08 Jul 2020.
- Knowledge Distillation: A Survey. Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao. 09 Jun 2020. [VLM]
- Self-Distillation as Instance-Specific Label Smoothing. Zhilu Zhang, M. Sabuncu. 09 Jun 2020.
- Multi-view Contrastive Learning for Online Knowledge Distillation. Chuanguang Yang, Zhulin An, Yongjun Xu. 07 Jun 2020.
- Teacher-Class Network: A Neural Network Compression Mechanism. Shaiq Munir Malik, Muhammad Umair Haider, Fnu Mohbat, Musab Rasheed, M. Taj. 07 Apr 2020.
- Neural Networks Are More Productive Teachers Than Human Raters: Active Mixup for Data-Efficient Knowledge Distillation from a Blackbox Model. Dongdong Wang, Yandong Li, Liqiang Wang, Boqing Gong. 31 Mar 2020.
- Knowledge distillation via adaptive instance normalization. Jing Yang, Brais Martínez, Adrian Bulat, Georgios Tzimiropoulos. 09 Mar 2020.
- QKD: Quantization-aware Knowledge Distillation. Jangho Kim, Yash Bhalgat, Jinwon Lee, Chirag I. Patel, Nojun Kwak. 28 Nov 2019. [MQ]
- Search to Distill: Pearls are Everywhere but not the Eyes. Yu Liu, Xuhui Jia, Mingxing Tan, Raviteja Vemulapalli, Yukun Zhu, Bradley Green, Xiaogang Wang. 20 Nov 2019.
- Preparing Lessons: Improve Knowledge Distillation with Better Supervision. Tiancheng Wen, Shenqi Lai, Xueming Qian. 18 Nov 2019.
- Knowledge Transfer Graph for Deep Collaborative Learning. Soma Minami, Tsubasa Hirakawa, Takayoshi Yamashita, H. Fujiyoshi. 10 Sep 2019.