Distilling Knowledge via Knowledge Review
arXiv: 2104.09044 · 19 April 2021
Pengguang Chen, Shu Liu, Hengshuang Zhao, Jiaya Jia

Links: arXiv (abs) · PDF · HTML · GitHub (272★)
Papers citing "Distilling Knowledge via Knowledge Review" — 15 / 215 papers shown
| Title | Authors | Tags | Metrics | Date |
|---|---|---|---|---|
| Knowledge Distillation with the Reused Teacher Classifier | Defang Chen, Jianhan Mei, Hailin Zhang, C. Wang, Yan Feng, Chun-Yen Chen | — | 99 · 172 · 0 | 26 Mar 2022 |
| Decoupled Knowledge Distillation | Borui Zhao, Quan Cui, Renjie Song, Yiyu Qiu, Jiajun Liang | — | 96 · 555 · 0 | 16 Mar 2022 |
| TransKD: Transformer Knowledge Distillation for Efficient Semantic Segmentation | R. Liu, Kailun Yang, Alina Roitberg, Jiaming Zhang, Kunyu Peng, Huayao Liu, Yaonan Wang, Rainer Stiefelhagen | ViT | 91 · 38 · 0 | 27 Feb 2022 |
| MonoDistill: Learning Spatial Features for Monocular 3D Object Detection | Zhiyu Chong, Xinzhu Ma, Hong Zhang, Yuxin Yue, Haojie Li, Zhihui Wang, Wanli Ouyang | 3DPC | 177 · 102 · 0 | 26 Jan 2022 |
| Anomaly Detection via Reverse Distillation from One-Class Embedding | Hanqiu Deng, Xingyu Li | UQCV | 204 · 490 · 0 | 26 Jan 2022 |
| Pixel Distillation: A New Knowledge Distillation Scheme for Low-Resolution Image Recognition | Guangyu Guo, Dingwen Zhang, Longfei Han, Nian Liu, Ming-Ming Cheng, Junwei Han | — | 61 · 2 · 0 | 17 Dec 2021 |
| Information Theoretic Representation Distillation | Roy Miles, Adrian Lopez-Rodriguez, K. Mikolajczyk | MQ | 103 · 22 · 0 | 01 Dec 2021 |
| Oracle Teacher: Leveraging Target Information for Better Knowledge Distillation of CTC Models | J. Yoon, H. Kim, Hyeon Seung Lee, Sunghwan Ahn, N. Kim | — | 62 · 1 · 0 | 05 Nov 2021 |
| Response-based Distillation for Incremental Object Detection | Tao Feng, Mang Wang | ObjD, CLL | 110 · 1 · 0 | 26 Oct 2021 |
| Distilling and Transferring Knowledge via cGAN-generated Samples for Image Classification and Regression | Xin Ding, Z. J. Wang, Zuheng Xu, Z. Jane Wang, William J. Welch | — | 133 · 22 · 0 | 07 Apr 2021 |
| Distilling a Powerful Student Model via Online Knowledge Distillation | Shaojie Li, Mingbao Lin, Yan Wang, Yongjian Wu, Yonghong Tian, Ling Shao, Rongrong Ji | FedML | 117 · 47 · 0 | 26 Mar 2021 |
| PURSUhInT: In Search of Informative Hint Points Based on Layer Clustering for Knowledge Distillation | Reyhan Kevser Keser, Aydin Ayanzadeh, O. A. Aghdam, Çaglar Kilcioglu, B. U. Toreyin, N. K. Üre | — | 53 · 7 · 0 | 26 Feb 2021 |
| Learnable Boundary Guided Adversarial Training | Jiequan Cui, Shu Liu, Liwei Wang, Jiaya Jia | OOD, AAML | 113 · 132 · 0 | 23 Nov 2020 |
| Adjoined Networks: A Training Paradigm with Applications to Network Compression | Utkarsh Nath, Shrinu Kushagra, Yingzhen Yang | — | 53 · 2 · 0 | 10 Jun 2020 |
| Contrastive Representation Distillation | Yonglong Tian, Dilip Krishnan, Phillip Isola | — | 211 · 1,057 · 0 | 23 Oct 2019 |