Distilling Knowledge via Knowledge Review (arXiv 2104.09044)
19 April 2021
Pengguang Chen
Shu Liu
Hengshuang Zhao
Jiaya Jia
Links: arXiv (abs) · PDF · HTML · GitHub (272★)
Papers citing "Distilling Knowledge via Knowledge Review"
50 / 215 papers shown
Deep Collective Knowledge Distillation
Jihyeon Seo
Kyusam Oh
Chanho Min
Yongkeun Yun
Sungwoo Cho
29
0
0
18 Apr 2023
Towards Efficient Task-Driven Model Reprogramming with Foundation Models
Shoukai Xu
Jiangchao Yao
Ran Luo
Shuhai Zhang
Zihao Lian
Mingkui Tan
Bo Han
Yaowei Wang
95
6
0
05 Apr 2023
A Simple and Generic Framework for Feature Distillation via Channel-wise Transformation
Ziwei Liu
Yongtao Wang
Xiaojie Chu
75
6
0
23 Mar 2023
From Knowledge Distillation to Self-Knowledge Distillation: A Unified Approach with Normalized Loss and Customized Soft Labels
Zhendong Yang
Ailing Zeng
Zhe Li
Tianke Zhang
Chun Yuan
Yu Li
107
80
0
23 Mar 2023
Detecting the open-world objects with the help of the Brain
Shuailei Ma
Yuefeng Wang
Ying-yu Wei
Peihao Chen
Zhixiang Ye
Jiaqi Fan
Enming Zhang
Thomas H. Li
VLM
ObjD
66
3
0
21 Mar 2023
Towards a Smaller Student: Capacity Dynamic Distillation for Efficient Image Retrieval
Yi Xie
Huaidong Zhang
Xuemiao Xu
Jianqing Zhu
Shengfeng He
VLM
58
14
0
16 Mar 2023
Knowledge Distillation from Single to Multi Labels: an Empirical Study
Youcai Zhang
Yuzhuo Qin
Heng-Ye Liu
Yanhao Zhang
Yaqian Li
X. Gu
VLM
80
2
0
15 Mar 2023
MobileVOS: Real-Time Video Object Segmentation Contrastive Learning meets Knowledge Distillation
Roy Miles
M. K. Yucel
Bruno Manganelli
Albert Saà-Garriga
VOS
81
25
0
14 Mar 2023
Generic-to-Specific Distillation of Masked Autoencoders
Wei Huang
Zhiliang Peng
Li Dong
Furu Wei
Jianbin Jiao
Qixiang Ye
84
23
0
28 Feb 2023
Take a Prior from Other Tasks for Severe Blur Removal
Pei Wang
Danna Xue
Yu Zhu
Jinqiu Sun
Qingsen Yan
Sung-eui Yoon
Yanning Zhang
66
3
0
14 Feb 2023
Stitchable Neural Networks
Zizheng Pan
Jianfei Cai
Bohan Zhuang
91
25
0
13 Feb 2023
Knowledge Distillation in Vision Transformers: A Critical Review
Gousia Habib
Tausifa Jan Saleem
Brejesh Lall
96
16
0
04 Feb 2023
Adaptively Integrated Knowledge Distillation and Prediction Uncertainty for Continual Learning
Kanghao Chen
Sijia Liu
Ruixuan Wang
Weishi Zheng
KELM
CLL
29
0
0
18 Jan 2023
Dataset Distillation: A Comprehensive Review
Ruonan Yu
Songhua Liu
Xinchao Wang
DD
160
131
0
17 Jan 2023
StereoDistill: Pick the Cream from LiDAR for Distilling Stereo-based 3D Object Detection
Zhe Liu
Xiaoqing Ye
Xiao Tan
Errui Ding
Xiang Bai
3DPC
89
8
0
04 Jan 2023
Guided Hybrid Quantization for Object detection in Multimodal Remote Sensing Imagery via One-to-one Self-teaching
Jiaqing Zhang
Jie Lei
Weiying Xie
Yunsong Li
Wenxuan Wang
MQ
86
23
0
31 Dec 2022
Resolving Task Confusion in Dynamic Expansion Architectures for Class Incremental Learning
Bin Huang
Zhineng Chen
Peng Zhou
Jiayin Chen
Zuxuan Wu
CLL
70
22
0
29 Dec 2022
Discriminator-Cooperated Feature Map Distillation for GAN Compression
Tie Hu
Mingbao Lin
Lizhou You
Chia-Wen Lin
Rongrong Ji
79
9
0
29 Dec 2022
Exploring Content Relationships for Distilling Efficient GANs
Lizhou You
Mingbao Lin
Tie Hu
Chia-Wen Lin
Rongrong Ji
75
4
0
21 Dec 2022
Hint-dynamic Knowledge Distillation
Yiyang Liu
Chenxin Li
Xiaotong Tu
Xinghao Ding
Yue Huang
75
1
0
30 Nov 2022
Curriculum Temperature for Knowledge Distillation
Zheng Li
Xiang Li
Lingfeng Yang
Borui Zhao
Renjie Song
Lei Luo
Jun Yu Li
Jian Yang
81
146
0
29 Nov 2022
Rethinking Implicit Neural Representations for Vision Learners
Yiran Song
Qianyu Zhou
Lizhuang Ma
85
7
0
22 Nov 2022
D³ETR: Decoder Distillation for Detection Transformer
Xiaokang Chen
Jiahui Chen
Yang Liu
Gang Zeng
78
16
0
17 Nov 2022
DETRDistill: A Universal Knowledge Distillation Framework for DETR-families
Jiahao Chang
Shuo Wang
Guangkai Xu
Zehui Chen
Chenhongyi Yang
Fengshang Zhao
102
29
0
17 Nov 2022
Pixel-Wise Contrastive Distillation
Junqiang Huang
Zichao Guo
130
4
0
01 Nov 2022
A pruning method based on the dissimilarity of angle among channels and filters
Jiayi Yao
P. Li
Xiatao Kang
Yuzhe Wang
3DPC
58
3
0
29 Oct 2022
Improved Feature Distillation via Projector Ensemble
Yudong Chen
Sen Wang
Jiajun Liu
Xuwei Xu
Frank de Hoog
Zi Huang
82
41
0
27 Oct 2022
Respecting Transfer Gap in Knowledge Distillation
Yulei Niu
Long Chen
Chan Zhou
Hanwang Zhang
99
25
0
23 Oct 2022
Feature Reconstruction Attacks and Countermeasures of DNN training in Vertical Federated Learning
Peng Ye
Zhifeng Jiang
Wei Wang
Yue Liu
Baochun Li
AAML
FedML
83
18
0
13 Oct 2022
Asymmetric Temperature Scaling Makes Larger Networks Teach Well Again
Xin-Chun Li
Wenxuan Fan
Shaoming Song
Yinchuan Li
Bingshuai Li
Yunfeng Shao
De-Chuan Zhan
120
31
0
10 Oct 2022
Attention Distillation: self-supervised vision transformer students need more guidance
Kai Wang
Fei Yang
Joost van de Weijer
ViT
57
18
0
03 Oct 2022
PROD: Progressive Distillation for Dense Retrieval
Zhenghao Lin
Yeyun Gong
Xiao Liu
Hang Zhang
Chen Lin
...
Jian Jiao
Jing Lu
Daxin Jiang
Rangan Majumder
Nan Duan
129
27
0
27 Sep 2022
Switchable Online Knowledge Distillation
Biao Qian
Yang Wang
Hongzhi Yin
Richang Hong
Meng Wang
98
39
0
12 Sep 2022
ViTKD: Practical Guidelines for ViT feature knowledge distillation
Zhendong Yang
Zhe Li
Ailing Zeng
Zexian Li
Chun Yuan
Yu Li
142
42
0
06 Sep 2022
Rethinking Knowledge Distillation via Cross-Entropy
Zhendong Yang
Zhe Li
Yuan Gong
Tianke Zhang
Shanshan Lao
Chun Yuan
Yu Li
71
14
0
22 Aug 2022
Difficulty-Aware Simulator for Open Set Recognition
WonJun Moon
Junho Park
Hyun Seok Seong
Cheol-Ho Cho
Jae-Pil Heo
71
32
0
20 Jul 2022
Knowledge Condensation Distillation
Chenxin Li
Mingbao Lin
Zhiyuan Ding
Nie Lin
Yihong Zhuang
Yue Huang
Xinghao Ding
Liujuan Cao
88
28
0
12 Jul 2022
Cross-Architecture Knowledge Distillation
Yufan Liu
Jiajiong Cao
Bing Li
Weiming Hu
Jin-Fei Ding
Liang Li
76
44
0
12 Jul 2022
Dynamic Contrastive Distillation for Image-Text Retrieval
Jun Rao
Liang Ding
Shuhan Qi
Meng Fang
Yang Liu
Liqiong Shen
Dacheng Tao
VLM
112
33
0
04 Jul 2022
Boosting Single-Frame 3D Object Detection by Simulating Multi-Frame Point Clouds
Wu Zheng
Li Jiang
Fanbin Lu
Yangyang Ye
Chi-Wing Fu
3DPC
ObjD
95
9
0
03 Jul 2022
Boosting 3D Object Detection by Simulating Multimodality on Point Clouds
Wu Zheng
Ming-Hong Hong
Li Jiang
Chi-Wing Fu
3DPC
95
31
0
30 Jun 2022
Parameter-Efficient and Student-Friendly Knowledge Distillation
Jun Rao
Xv Meng
Liang Ding
Shuhan Qi
Dacheng Tao
93
51
0
28 May 2022
Knowledge Distillation via the Target-aware Transformer
Sihao Lin
Hongwei Xie
Bing Wang
Kaicheng Yu
Xiaojun Chang
Xiaodan Liang
G. Wang
ViT
94
107
0
22 May 2022
Knowledge Distillation from A Stronger Teacher
Tao Huang
Shan You
Fei Wang
Chao Qian
Chang Xu
99
258
0
21 May 2022
[Re] Distilling Knowledge via Knowledge Review
Apoorva Verma
Pranjal Gulati
Sarthak Gupta
VLM
32
1
0
18 May 2022
Binarizing by Classification: Is soft function really necessary?
Yefei He
Luoming Zhang
Weijia Wu
Hong Zhou
MQ
116
3
0
16 May 2022
Attention-based Knowledge Distillation in Multi-attention Tasks: The Impact of a DCT-driven Loss
Alejandro López-Cifuentes
Marcos Escudero-Viñolo
Jesús Bescós
Juan C. Sanmiguel
59
1
0
04 May 2022
Masked Generative Distillation
Zhendong Yang
Zhe Li
Mingqi Shao
Dachuan Shi
Zehuan Yuan
Chun Yuan
FedML
98
181
0
03 May 2022
DistPro: Searching A Fast Knowledge Distillation Process via Meta Optimization
XueQing Deng
Dawei Sun
Shawn D. Newsam
Peng Wang
60
9
0
12 Apr 2022
Overcoming Catastrophic Forgetting in Incremental Object Detection via Elastic Response Distillation
Tao Feng
Mang Wang
Hangjie Yuan
ObjD
CLL
105
90
0
05 Apr 2022