Teacher-Student Architecture for Knowledge Distillation: A Survey
arXiv:2308.04268 (8 August 2023)
Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu
Papers citing "Teacher-Student Architecture for Knowledge Distillation: A Survey" (50 of 142 papers shown):

Learning Critically: Selective Self Distillation in Federated Learning on Non-IID Data [FedML]
Yuting He, Yiqiang Chen, Xiaodong Yang, H. Yu, Yi-Hua Huang, Yang Gu (20 Apr 2025)

Unbiased Knowledge Distillation for Recommendation
Gang Chen, Jiawei Chen, Fuli Feng, Sheng Zhou, Xiangnan He (27 Nov 2022)

FreeKD: Free-direction Knowledge Distillation for Graph Neural Networks
Kaituo Feng, Changsheng Li, Ye Yuan, Guoren Wang (14 Jun 2022)

Knowledge Distillation from A Stronger Teacher
Tao Huang, Shan You, Fei Wang, Chao Qian, Chang Xu (21 May 2022)

Cross-Image Relational Knowledge Distillation for Semantic Segmentation
Chuanguang Yang, Helong Zhou, Zhulin An, Xue Jiang, Yong Xu, Qian Zhang (14 Apr 2022)

Fine-tuning Global Model via Data-Free Knowledge Distillation for Non-IID Federated Learning [FedML]
Lin Zhang, Li Shen, Liang Ding, Dacheng Tao, Ling-Yu Duan (17 Mar 2022)

Decoupled Knowledge Distillation
Borui Zhao, Quan Cui, Renjie Song, Yiyu Qiu, Jiajun Liang (16 Mar 2022)

Cross-Task Knowledge Distillation in Multi-Task Recommendation
Chenxiao Yang, Junwei Pan, Xiaofeng Gao, Tingyu Jiang, Dapeng Liu, Guihai Chen (20 Feb 2022)

Information Theoretic Representation Distillation [MQ]
Roy Miles, Adrian Lopez-Rodriguez, K. Mikolajczyk (01 Dec 2021)

On Representation Knowledge Distillation for Graph Neural Networks
Chaitanya K. Joshi, Fayao Liu, Xu Xun, Jie Lin, Chuan-Sheng Foo (09 Nov 2021)

Estimating and Maximizing Mutual Information for Knowledge Distillation
A. Shrivastava, Yanjun Qi, Vicente Ordonez (29 Oct 2021)

FedGEMS: Federated Learning of Larger Server Models via Selective Knowledge Fusion [FedML]
Sijie Cheng, Jingwen Wu, Yanghua Xiao, Yang Liu, Yang Liu (21 Oct 2021)

Multi-Task Self-Training for Learning General Representations [SSL]
Golnaz Ghiasi, Barret Zoph, E. D. Cubuk, Quoc V. Le, Nayeon Lee (25 Aug 2021)

Distilling Holistic Knowledge with Graph Neural Networks
Sheng Zhou, Yucheng Wang, Defang Chen, Jiawei Chen, Xin Eric Wang, Can Wang, Jiajun Bu (12 Aug 2021)

Online Knowledge Distillation for Efficient Pose Estimation
Zheng Li, Jingwen Ye, Xiuming Zhang, Ying Huang, Zhigeng Pan (04 Aug 2021)

Local-Global Knowledge Distillation in Heterogeneous Federated Learning with Non-IID Data [FedML]
Dezhong Yao, Wanning Pan, Yutong Dai, Yao Wan, Xiaofeng Ding, Hai Jin, Zheng Xu, Lichao Sun (30 Jun 2021)

Topology Distillation for Recommender System
SeongKu Kang, Junyoung Hwang, Wonbin Kweon, Hwanjo Yu (16 Jun 2021)

Does Knowledge Distillation Really Work? [FedML]
Samuel Stanton, Pavel Izmailov, Polina Kirichenko, Alexander A. Alemi, A. Wilson (10 Jun 2021)

Knowledge distillation: A good teacher is patient and consistent [VLM]
Lucas Beyer, Xiaohua Zhai, Amelie Royer, L. Markeeva, Rohan Anil, Alexander Kolesnikov (09 Jun 2021)

BERT Learns to Teach: Knowledge Distillation with Meta Learning
Wangchunshu Zhou, Canwen Xu, Julian McAuley (08 Jun 2021)

Multi-Target Domain Adaptation with Collaborative Consistency Learning
Takashi Isobe, Xu Jia, Shuaijun Chen, Jianzhong He, Yongjie Shi, Jian-zhuo Liu, Huchuan Lu, Shengjin Wang (07 Jun 2021)

Preservation of the Global Knowledge by Not-True Distillation in Federated Learning [FedML]
Gihun Lee, Minchan Jeong, Yongjin Shin, Sangmin Bae, Se-Young Yun (06 Jun 2021)

Graph-Free Knowledge Distillation for Graph Neural Networks
Xiang Deng, Zhongfei Zhang (16 May 2021)

When Human Pose Estimation Meets Robustness: Adversarial Algorithms and Benchmarks [AAML]
Jiahang Wang, Sheng Jin, Wentao Liu, Weizhong Liu, Chao Qian, Ping Luo (13 May 2021)

Farewell to Mutual Information: Variational Distillation for Cross-Modal Person Re-Identification
Xudong Tian, Zhizhong Zhang, Shaohui Lin, Yanyun Qu, Yuan Xie, Lizhuang Ma (07 Apr 2021)

Complementary Relation Contrastive Distillation
Jinguo Zhu, Shixiang Tang, Dapeng Chen, Shijie Yu, Yakun Liu, A. Yang, M. Rong, Xiaohua Wang (29 Mar 2021)

Multimodal Knowledge Expansion
Zihui Xue, Sucheng Ren, Zhengqi Gao, Hang Zhao (26 Mar 2021)

Adaptive Multi-Teacher Multi-level Knowledge Distillation
Yuang Liu, Wei Zhang, Jun Wang (06 Mar 2021)

Extract the Knowledge of Graph Neural Networks and Go Beyond it: An Effective Knowledge Distillation Framework
Cheng Yang, Jiawei Liu, C. Shi (04 Mar 2021)

Localization Distillation for Dense Object Detection [ObjD]
Zhaohui Zheng, Rongguang Ye, Ping Wang, Dongwei Ren, W. Zuo, Qibin Hou, Ming-Ming Cheng (24 Feb 2021)

Learning Student-Friendly Teacher Networks for Knowledge Distillation
D. Park, Moonsu Cha, C. Jeong, Daesin Kim, Bohyung Han (12 Feb 2021)

FedAUX: Leveraging Unlabeled Auxiliary Data in Federated Learning [FedML]
Felix Sattler, Tim Korjakow, R. Rischke, Wojciech Samek (04 Feb 2021)

Reinforced Multi-Teacher Selection for Knowledge Distillation
Fei Yuan, Linjun Shou, J. Pei, Wutao Lin, Ming Gong, Yan Fu, Daxin Jiang (11 Dec 2020)

DE-RRD: A Knowledge Distillation Framework for Recommender System
SeongKu Kang, Junyoung Hwang, Wonbin Kweon, Hwanjo Yu (08 Dec 2020)

Ensemble Knowledge Distillation for CTR Prediction
Jieming Zhu, Jinyang Liu, Weiqi Li, Jincai Lai, Xiuqiang He, Liang Chen, Zibin Zheng (08 Nov 2020)

FedBE: Making Bayesian Model Ensemble Applicable to Federated Learning [FedML]
Hong-You Chen, Wei-Lun Chao (04 Sep 2020)

Distillation-Based Semi-Supervised Federated Learning for Communication-Efficient Collaborative Training with Non-IID Private Data [FedML]
Sohei Itahara, Takayuki Nishio, Yusuke Koda, M. Morikura, Koji Yamamoto (14 Aug 2020)

Knowledge Distillation in Deep Learning and its Applications [FedML]
Abdolmaged Alkhulaifi, Fahad Alsahli, Irfan Ahmad (17 Jul 2020)

Knowledge Distillation for Multi-task Learning [MoMe]
Weihong Li, Hakan Bilen (14 Jul 2020)

Self-supervised Learning: Generative or Contrastive [SSL]
Xiao Liu, Fanjin Zhang, Zhenyu Hou, Zhaoyu Wang, Li Mian, Jing Zhang, Jie Tang (15 Jun 2020)

Ensemble Distillation for Robust Model Fusion in Federated Learning [FedML]
Tao R. Lin, Lingjing Kong, Sebastian U. Stich, Martin Jaggi (12 Jun 2020)

Knowledge Distillation Meets Self-Supervision [FedML]
Guodong Xu, Ziwei Liu, Xiaoxiao Li, Chen Change Loy (12 Jun 2020)

Knowledge Distillation: A Survey [VLM]
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao (09 Jun 2020)

Peer Collaborative Learning for Online Knowledge Distillation [FedML]
Guile Wu, S. Gong (07 Jun 2020)

Multi-view Contrastive Learning for Online Knowledge Distillation
Chuanguang Yang, Zhulin An, Yongjun Xu (07 Jun 2020)

Joint Progressive Knowledge Distillation and Unsupervised Domain Adaptation
Le Thanh Nguyen-Meidine, Eric Granger, M. Kiran, Jose Dolz, Louis-Antoine Blais-Morin (16 May 2020)

Deep Learning for Wireless Communications
T. Erpek, Tim O'Shea, Y. Sagduyu, Yi Shi, T. Clancy (12 May 2020)

A Simple Semi-Supervised Learning Framework for Object Detection
Kihyuk Sohn, Zizhao Zhang, Chun-Liang Li, Han Zhang, Chen-Yu Lee, Tomas Pfister (10 May 2020)

Heterogeneous Knowledge Distillation using Information Flow Modeling
Nikolaos Passalis, Maria Tzelepi, Anastasios Tefas (02 May 2020)

Inter-Region Affinity Distillation for Road Marking Segmentation
Yuenan Hou, Zheng Ma, Chunxiao Liu, Tak-Wai Hui, Chen Change Loy (11 Apr 2020)