arXiv:2006.05525
Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
9 June 2020 · VLM
Papers citing "Knowledge Distillation: A Survey" (50 of 328 shown)
Improving Multi-Task Deep Neural Networks via Knowledge Distillation for Natural Language Understanding
Xiaodong Liu, Pengcheng He, Weizhu Chen, Jianfeng Gao · FedML · 20 Apr 2019
Knowledge Distillation via Route Constrained Optimization
Xiao Jin, Baoyun Peng, Yichao Wu, Yu Liu, Jiaheng Liu, Ding Liang, Junjie Yan, Xiaolin Hu · 19 Apr 2019
Feature Fusion for Online Mutual Knowledge Distillation
Jangho Kim, Minsung Hyun, Inseop Chung, Nojun Kwak · FedML · 19 Apr 2019
Audio-Visual Model Distillation Using Acoustic Images
Andrés F. Pérez, Valentina Sanguineti, Pietro Morerio, Vittorio Murino · VLM · 16 Apr 2019
Unifying Heterogeneous Classifiers with Distillation
J. Vongkulbhisal, Phongtharin Vinayavekhin, M. V. Scarzanella · 12 Apr 2019
Knowledge Flow: Improve Upon Your Teachers
Iou-Jen Liu, Jian Peng, Alex Schwing · 11 Apr 2019
Variational Information Distillation for Knowledge Transfer
SungSoo Ahn, S. Hu, Andreas C. Damianou, Neil D. Lawrence, Zhenwen Dai · 11 Apr 2019
Knowledge Squeezed Adversarial Network Compression
Changyong Shu, Li Peng, Xie Yuan, Yanyun Qu, Longquan Dai, Lizhuang Ma · GAN · 10 Apr 2019
Relational Knowledge Distillation
Wonpyo Park, Dongju Kim, Yan Lu, Minsu Cho · 10 Apr 2019
Knowledge Distillation For Recurrent Neural Network Language Modeling With Trust Regularization
Yangyang Shi, M. Hwang, X. Lei, Haoyu Sheng · 08 Apr 2019
Learning Metrics from Teachers: Compact Networks for Image Embedding
Lu Yu, V. O. Yazici, Xialei Liu, Joost van de Weijer, Yongmei Cheng, Arnau Ramisa · 07 Apr 2019
Semantic-Aware Knowledge Preservation for Zero-Shot Sketch-Based Image Retrieval
Qing Liu, Lingxi Xie, Huiyu Wang, Alan Yuille · VLM · 05 Apr 2019
White-to-Black: Efficient Distillation of Black-Box Adversarial Attacks
Yotam Gil, Yoav Chai, O. Gorodissky, Jonathan Berant · MLAU, AAML · 04 Apr 2019
A Comprehensive Overhaul of Feature Distillation
Byeongho Heo, Jeesoo Kim, Sangdoo Yun, Hyojin Park, Nojun Kwak, J. Choi · 03 Apr 2019
Correlation Congruence for Knowledge Distillation
Baoyun Peng, Xiao Jin, Jiaheng Liu, Shunfeng Zhou, Yichao Wu, Yu Liu, Dongsheng Li, Zhaoning Zhang · 03 Apr 2019
M2KD: Multi-model and Multi-level Knowledge Distillation for Incremental Learning
Peng Zhou, Long Mai, Jianming Zhang, N. Xu, Zuxuan Wu, L. Davis · CLL, VLM · 03 Apr 2019
Why ResNet Works? Residuals Generalize
Fengxiang He, Tongliang Liu, Dacheng Tao · 02 Apr 2019
Data-Free Learning of Student Networks
Hanting Chen, Yunhe Wang, Chang Xu, Zhaohui Yang, Chuanjian Liu, Boxin Shi, Chunjing Xu, Chao Xu, Qi Tian · FedML · 02 Apr 2019
Large Batch Optimization for Deep Learning: Training BERT in 76 minutes
Yang You, Jing Li, Sashank J. Reddi, Jonathan Hseu, Sanjiv Kumar, Srinadh Bhojanapalli, Xiaodan Song, J. Demmel, Kurt Keutzer, Cho-Jui Hsieh · ODL · 01 Apr 2019
Overcoming Catastrophic Forgetting with Unlabeled Data in the Wild
Kibok Lee, Kimin Lee, Jinwoo Shin, Honglak Lee · CLL · 29 Mar 2019
Distilling Task-Specific Knowledge from BERT into Simple Neural Networks
Raphael Tang, Yao Lu, Linqing Liu, Lili Mou, Olga Vechtomova, Jimmy J. Lin · 28 Mar 2019
Improving Neural Architecture Search Image Classifiers via Ensemble Learning
Vladimir Macko, Charles Weill, Hanna Mazzawi, J. Gonzalvo · 14 Mar 2019
Knowledge Adaptation for Efficient Semantic Segmentation
Tong He, Chunhua Shen, Zhi Tian, Dong Gong, Changming Sun, Youliang Yan · SSeg · 12 Mar 2019
Refine and Distill: Exploiting Cycle-Inconsistency and Knowledge Distillation for Unsupervised Monocular Depth Estimation
Andrea Pilzer, Stéphane Lathuilière, N. Sebe, Elisa Ricci · MDE · 11 Mar 2019
Structured Knowledge Distillation for Dense Prediction
Yifan Liu, Chris Liu, Jingdong Wang, Zhenbo Luo · 11 Mar 2019
Efficient Video Classification Using Fewer Frames
S. Bhardwaj, Mukundhan Srinivasan, Mitesh M. Khapra · 27 Feb 2019
Multilingual Neural Machine Translation with Knowledge Distillation
Xu Tan, Yi Ren, Di He, Tao Qin, Zhou Zhao, Tie-Yan Liu · 27 Feb 2019
DDFlow: Learning Optical Flow with Unlabeled Data Distillation
Pengpeng Liu, Irwin King, Michael R. Lyu, Jia Xu · 25 Feb 2019
Optimizing Network Performance for Distributed DNN Training on GPU Clusters: ImageNet/AlexNet Training in 1.5 Minutes
Peng Sun, Wansen Feng, Ruobing Han, Shengen Yan, Yonggang Wen · AI4CE · 19 Feb 2019
Improved Knowledge Distillation via Teacher Assistant
Seyed Iman Mirzadeh, Mehrdad Farajtabar, Ang Li, Nir Levine, Akihiro Matsukawa, H. Ghasemzadeh · 09 Feb 2019
Compressing GANs using Knowledge Distillation
Angeline Aguinaldo, Ping-Yeh Chiang, Alex Gain, Ameya D. Patil, Kolten Pearson, Soheil Feizi · GAN · 01 Feb 2019
Improving the Interpretability of Deep Neural Networks with Knowledge Distillation
Xuan Liu, Xiaoguang Wang, Stan Matwin · HAI · 28 Dec 2018
Learning Student Networks via Feature Embedding
Hanting Chen, Yunhe Wang, Chang Xu, Chao Xu, Dacheng Tao · 17 Dec 2018
Spatial Knowledge Distillation to aid Visual Reasoning
Somak Aditya, Rudra Saha, Yezhou Yang, Chitta Baral · 10 Dec 2018
FBNet: Hardware-Aware Efficient ConvNet Design via Differentiable Neural Architecture Search
Bichen Wu, Xiaoliang Dai, Peizhao Zhang, Yanghan Wang, Fei Sun, Yiming Wu, Yuandong Tian, Peter Vajda, Yangqing Jia, Kurt Keutzer · MQ · 09 Dec 2018
Online Model Distillation for Efficient Video Inference
Ravi Teja Mullapudi, Steven Chen, Keyi Zhang, Deva Ramanan, Kayvon Fatahalian · VGen · 06 Dec 2018
MEAL: Multi-Model Ensemble via Adversarial Learning
Zhiqiang Shen, Zhankui He, Xiangyang Xue · AAML, FedML · 06 Dec 2018
Teacher-Student Compression with Generative Adversarial Networks
Ruishan Liu, Nicolò Fusi, Lester W. Mackey · 05 Dec 2018
Transferring Knowledge across Learning Processes
Sebastian Flennerhag, Pablo G. Moreno, Neil D. Lawrence, Andreas C. Damianou · 03 Dec 2018
Knowledge Distillation with Feature Maps for Image Classification
Wei-Chun Chen, Chia-Che Chang, Chien-Yu Lu, Che-Rung Lee · 03 Dec 2018
Snapshot Distillation: Teacher-Student Optimization in One Generation
Chenglin Yang, Lingxi Xie, Chi Su, Alan Yuille · 01 Dec 2018
Dataset Distillation
Tongzhou Wang, Jun-Yan Zhu, Antonio Torralba, Alexei A. Efros · DD · 27 Nov 2018
Low-resolution Face Recognition in the Wild via Selective Knowledge Distillation
Shiming Ge, Shengwei Zhao, Chenyu Li, Jia Li · CVBM · 25 Nov 2018
Self-Referenced Deep Learning
Xu Lan, Xiatian Zhu, S. Gong · 19 Nov 2018
Fast Human Pose Estimation
Feng Zhang, Xiatian Zhu, Mao Ye · 3DH · 13 Nov 2018
Private Model Compression via Knowledge Distillation
Ji Wang, Weidong Bao, Lichao Sun, Xiaomin Zhu, Bokai Cao, Philip S. Yu · FedML · 13 Nov 2018
Learning and Generalization in Overparameterized Neural Networks, Going Beyond Two Layers
Zeyuan Allen-Zhu, Yuanzhi Li, Yingyu Liang · MLT · 12 Nov 2018
Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons
Byeongho Heo, Minsik Lee, Sangdoo Yun, J. Choi · 08 Nov 2018
Amalgamating Knowledge towards Comprehensive Classification
Chengchao Shen, L. Câlmâc, Mingli Song, Li Sun, Xiuming Zhang · MoMe · 07 Nov 2018
Cogni-Net: Cognitive Feature Learning through Deep Visual Perception
Pranay Mukherjee, Abhirup Das, A. Bhunia, P. Roy · 01 Nov 2018