Distilling Knowledge via Knowledge Review
arXiv:2104.09044 · 19 April 2021
Pengguang Chen, Shu Liu, Hengshuang Zhao, Jiaya Jia
Papers citing "Distilling Knowledge via Knowledge Review" (25 of 75 shown)
Rethinking Implicit Neural Representations for Vision Learners
Yiran Song, Qianyu Zhou, Lizhuang Ma · 22 Nov 2022
D³ETR: Decoder Distillation for Detection Transformer
Xiaokang Chen, Jiahui Chen, Yong Liu, Gang Zeng · 17 Nov 2022
Pixel-Wise Contrastive Distillation
Junqiang Huang, Zichao Guo · 01 Nov 2022
Improved Feature Distillation via Projector Ensemble
Yudong Chen, Sen Wang, Jiajun Liu, Xuwei Xu, Frank de Hoog, Zi Huang · 27 Oct 2022
Respecting Transfer Gap in Knowledge Distillation
Yulei Niu, Long Chen, Chan Zhou, Hanwang Zhang · 23 Oct 2022
Feature Reconstruction Attacks and Countermeasures of DNN Training in Vertical Federated Learning
Peng Ye, Zhifeng Jiang, Wei Wang, Bo-wen Li, Baochun Li · AAML, FedML · 13 Oct 2022
PROD: Progressive Distillation for Dense Retrieval
Zhenghao Lin, Yeyun Gong, Xiao Liu, Hang Zhang, Chen Lin, ..., Jian Jiao, Jing Lu, Daxin Jiang, Rangan Majumder, Nan Duan · 27 Sep 2022
Rethinking Knowledge Distillation via Cross-Entropy
Zhendong Yang, Zhe Li, Yuan Gong, Tianke Zhang, Shanshan Lao, Chun Yuan, Yu Li · 22 Aug 2022
Boosting Single-Frame 3D Object Detection by Simulating Multi-Frame Point Clouds
Wu Zheng, Li Jiang, Fanbin Lu, Yangyang Ye, Chi-Wing Fu · 3DPC, ObjD · 03 Jul 2022
Boosting 3D Object Detection by Simulating Multimodality on Point Clouds
Wu Zheng, Ming-Hong Hong, Li Jiang, Chi-Wing Fu · 3DPC · 30 Jun 2022
Parameter-Efficient and Student-Friendly Knowledge Distillation
Jun Rao, Xv Meng, Liang Ding, Shuhan Qi, Dacheng Tao · 28 May 2022
Knowledge Distillation via the Target-aware Transformer
Sihao Lin, Hongwei Xie, Bing Wang, Kaicheng Yu, Xiaojun Chang, Xiaodan Liang, G. Wang · ViT · 22 May 2022
Knowledge Distillation from A Stronger Teacher
Tao Huang, Shan You, Fei Wang, Chao Qian, Chang Xu · 21 May 2022
[Re] Distilling Knowledge via Knowledge Review
Apoorva Verma, Pranjal Gulati, Sarthak Gupta · VLM · 18 May 2022
Masked Generative Distillation
Zhendong Yang, Zhe Li, Mingqi Shao, Dachuan Shi, Zehuan Yuan, Chun Yuan · FedML · 03 May 2022
Overcoming Catastrophic Forgetting in Incremental Object Detection via Elastic Response Distillation
Tao Feng, Mang Wang, Hangjie Yuan · ObjD, CLL · 05 Apr 2022
Knowledge Distillation with the Reused Teacher Classifier
Defang Chen, Jianhan Mei, Hailin Zhang, C. Wang, Yan Feng, Chun-Yen Chen · 26 Mar 2022
MonoDistill: Learning Spatial Features for Monocular 3D Object Detection
Zhiyu Chong, Xinzhu Ma, Hong Zhang, Yuxin Yue, Haojie Li, Zhihui Wang, Wanli Ouyang · 3DPC · 26 Jan 2022
Anomaly Detection via Reverse Distillation from One-Class Embedding
Hanqiu Deng, Xingyu Li · UQCV · 26 Jan 2022
Information Theoretic Representation Distillation
Roy Miles, Adrian Lopez-Rodriguez, K. Mikolajczyk · MQ · 01 Dec 2021
Oracle Teacher: Leveraging Target Information for Better Knowledge Distillation of CTC Models
J. Yoon, H. Kim, Hyeon Seung Lee, Sunghwan Ahn, N. Kim · 05 Nov 2021
Distilling and Transferring Knowledge via cGAN-generated Samples for Image Classification and Regression
Xin Ding, Z. J. Wang, Zuheng Xu, Z. Jane Wang, William J. Welch · 07 Apr 2021
Distilling a Powerful Student Model via Online Knowledge Distillation
Shaojie Li, Mingbao Lin, Yan Wang, Yongjian Wu, Yonghong Tian, Ling Shao, Rongrong Ji · FedML · 26 Mar 2021
Learnable Boundary Guided Adversarial Training
Jiequan Cui, Shu Liu, Liwei Wang, Jiaya Jia · OOD, AAML · 23 Nov 2020
MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam · 3DH · 17 Apr 2017