1904.01866
A Comprehensive Overhaul of Feature Distillation
3 April 2019
Byeongho Heo
Jeesoo Kim
Sangdoo Yun
Hyojin Park
Nojun Kwak
J. Choi
Papers citing "A Comprehensive Overhaul of Feature Distillation" (50 of 125 papers shown)
Exploring Content Relationships for Distilling Efficient GANs
Lizhou You
Mingbao Lin
Tie Hu
Rongrong Ji
49
3
0
21 Dec 2022
3D Point Cloud Pre-training with Knowledge Distillation from 2D Images
Yuan Yao
Yuanhan Zhang
Zhen-fei Yin
Jiebo Luo
Wanli Ouyang
Xiaoshui Huang
3DPC
29
10
0
17 Dec 2022
Attention-Based Depth Distillation with 3D-Aware Positional Encoding for Monocular 3D Object Detection
Zizhang Wu
Yunzhe Wu
Jian Pu
Xianzhi Li
Xiaoquan Wang
32
14
0
30 Nov 2022
Rethinking Implicit Neural Representations for Vision Learners
Yiran Song
Qianyu Zhou
Lizhuang Ma
24
7
0
22 Nov 2022
Knowledge Distillation for Detection Transformer with Consistent Distillation Points Sampling
Yu Wang
Xin Li
Shengzhao Wen
Fu-En Yang
Wanping Zhang
Gang Zhang
Haocheng Feng
Junyu Han
Errui Ding
47
5
0
15 Nov 2022
Structured Knowledge Distillation Towards Efficient and Compact Multi-View 3D Detection
Linfeng Zhang
Yukang Shi
Hung-Shuo Tai
Zhipeng Zhang
Yuan He
Ke Wang
Kaisheng Ma
28
2
0
14 Nov 2022
Pixel-Wise Contrastive Distillation
Junqiang Huang
Zichao Guo
47
4
0
01 Nov 2022
Improved Feature Distillation via Projector Ensemble
Yudong Chen
Sen Wang
Jiajun Liu
Xuwei Xu
Frank de Hoog
Zi Huang
39
38
0
27 Oct 2022
Collaborative Multi-Teacher Knowledge Distillation for Learning Low Bit-width Deep Neural Networks
Cuong Pham
Tuan Hoang
Thanh-Toan Do
FedML
MQ
40
14
0
27 Oct 2022
Respecting Transfer Gap in Knowledge Distillation
Yulei Niu
Long Chen
Chan Zhou
Hanwang Zhang
26
23
0
23 Oct 2022
Boosting Graph Neural Networks via Adaptive Knowledge Distillation
Zhichun Guo
Chunhui Zhang
Yujie Fan
Yijun Tian
Chuxu Zhang
Nitesh Chawla
26
32
0
12 Oct 2022
Bi-directional Weakly Supervised Knowledge Distillation for Whole Slide Image Classification
Linhao Qu
Xiao-Zhuo Luo
Manning Wang
Zhijian Song
WSOD
31
59
0
07 Oct 2022
Generative Adversarial Super-Resolution at the Edge with Knowledge Distillation
Simone Angarano
Francesco Salvetti
Mauro Martini
Marcello Chiaberge
GAN
51
21
0
07 Sep 2022
Masked Autoencoders Enable Efficient Knowledge Distillers
Yutong Bai
Zeyu Wang
Junfei Xiao
Chen Wei
Huiyu Wang
Alan Yuille
Yuyin Zhou
Cihang Xie
CLL
32
40
0
25 Aug 2022
Rethinking Knowledge Distillation via Cross-Entropy
Zhendong Yang
Zhe Li
Yuan Gong
Tianke Zhang
Shanshan Lao
Chun Yuan
Yu Li
33
14
0
22 Aug 2022
Lipschitz Continuity Retained Binary Neural Network
Yuzhang Shang
Dan Xu
Bin Duan
Ziliang Zong
Liqiang Nie
Yan Yan
16
19
0
13 Jul 2022
Knowledge Condensation Distillation
Chenxin Li
Mingbao Lin
Zhiyuan Ding
Nie Lin
Yihong Zhuang
Yue Huang
Xinghao Ding
Liujuan Cao
42
28
0
12 Jul 2022
Normalized Feature Distillation for Semantic Segmentation
Tao Liu
Xi Yang
Chenshu Chen
9
5
0
12 Jul 2022
ACT-Net: Asymmetric Co-Teacher Network for Semi-supervised Memory-efficient Medical Image Segmentation
Ziyuan Zhao
An Zhu
Zeng Zeng
B. Veeravalli
Cuntai Guan
27
9
0
05 Jul 2022
Boosting Single-Frame 3D Object Detection by Simulating Multi-Frame Point Clouds
Wu Zheng
Li Jiang
Fanbin Lu
Yangyang Ye
Chi-Wing Fu
3DPC
ObjD
46
9
0
03 Jul 2022
Boosting 3D Object Detection by Simulating Multimodality on Point Clouds
Wu Zheng
Ming-Hong Hong
Li Jiang
Chi-Wing Fu
3DPC
44
31
0
30 Jun 2022
Revisiting Label Smoothing and Knowledge Distillation Compatibility: What was Missing?
Keshigeyan Chandrasegaran
Ngoc-Trung Tran
Yunqing Zhao
Ngai-man Cheung
93
41
0
29 Jun 2022
Revisiting Architecture-aware Knowledge Distillation: Smaller Models and Faster Search
Taehyeon Kim
Heesoo Myeong
Se-Young Yun
37
2
0
27 Jun 2022
Unifying Voxel-based Representation with Transformer for 3D Object Detection
Yanwei Li
Yilun Chen
Xiaojuan Qi
Zeming Li
Jian Sun
Jiaya Jia
ViT
27
250
0
01 Jun 2022
PointDistiller: Structured Knowledge Distillation Towards Efficient and Compact 3D Detection
Linfeng Zhang
Runpei Dong
Hung-Shuo Tai
Kaisheng Ma
3DPC
72
47
0
23 May 2022
Knowledge Distillation via the Target-aware Transformer
Sihao Lin
Hongwei Xie
Bing Wang
Kaicheng Yu
Xiaojun Chang
Xiaodan Liang
G. Wang
ViT
22
104
0
22 May 2022
Knowledge Distillation from A Stronger Teacher
Tao Huang
Shan You
Fei Wang
Chao Qian
Chang Xu
35
238
0
21 May 2022
[Re] Distilling Knowledge via Knowledge Review
Apoorva Verma
Pranjal Gulati
Sarthak Gupta
VLM
24
0
0
18 May 2022
Knowledge Distillation Meets Open-Set Semi-Supervised Learning
Jing Yang
Xiatian Zhu
Adrian Bulat
Brais Martínez
Georgios Tzimiropoulos
40
8
0
13 May 2022
Generalized Knowledge Distillation via Relationship Matching
Han-Jia Ye
Su Lu
De-Chuan Zhan
FedML
22
20
0
04 May 2022
Masked Generative Distillation
Zhendong Yang
Zhe Li
Mingqi Shao
Dachuan Shi
Zehuan Yuan
Chun Yuan
FedML
38
169
0
03 May 2022
Cross-Image Relational Knowledge Distillation for Semantic Segmentation
Chuanguang Yang
Helong Zhou
Zhulin An
Xue Jiang
Yong Xu
Qian Zhang
42
170
0
14 Apr 2022
LiDAR Distillation: Bridging the Beam-Induced Domain Gap for 3D Object Detection
Yi Wei
Zibu Wei
Yongming Rao
Jiaxin Li
Jie Zhou
Jiwen Lu
58
63
0
28 Mar 2022
Knowledge Distillation as Efficient Pre-training: Faster Convergence, Higher Data-efficiency, and Better Transferability
Ruifei He
Shuyang Sun
Jihan Yang
Song Bai
Xiaojuan Qi
37
36
0
10 Mar 2022
Exploring Inter-Channel Correlation for Diversity-preserved Knowledge Distillation
Li Liu
Qingle Huang
Sihao Lin
Hongwei Xie
Bing Wang
Xiaojun Chang
Xiao-Xue Liang
28
100
0
08 Feb 2022
Data-Free Knowledge Transfer: A Survey
Yuang Liu
Wei Zhang
Jun Wang
Jianyong Wang
40
48
0
31 Dec 2021
ESGN: Efficient Stereo Geometry Network for Fast 3D Object Detection
Aqi Gao
Yanwei Pang
Jing Nie
Jiale Cao
Yishun Guo
3DPC
19
15
0
28 Nov 2021
EvDistill: Asynchronous Events to End-task Learning via Bidirectional Reconstruction-guided Cross-modal Knowledge Distillation
Lin Wang
Yujeong Chae
Sung-Hoon Yoon
Tae-Kyun Kim
Kuk-Jin Yoon
47
64
0
24 Nov 2021
Local-Selective Feature Distillation for Single Image Super-Resolution
Seonguk Park
Nojun Kwak
24
9
0
22 Nov 2021
MixACM: Mixup-Based Robustness Transfer via Distillation of Activated Channel Maps
Muhammad Awais
Fengwei Zhou
Chuanlong Xie
Jiawei Li
Sung-Ho Bae
Zhenguo Li
AAML
43
17
0
09 Nov 2021
PP-ShiTu: A Practical Lightweight Image Recognition System
Shengyun Wei
Ruoyu Guo
Cheng Cui
Bin Lu
Shuilong Dong
...
Xueying Lyu
Qiwen Liu
Xiaoguang Hu
Dianhai Yu
Yanjun Ma
CVBM
26
6
0
01 Nov 2021
Distilling Object Detectors with Feature Richness
Zhixing Du
Rui Zhang
Ming-Fang Chang
Xishan Zhang
Shaoli Liu
Tianshi Chen
Yunji Chen
ObjD
24
74
0
01 Nov 2021
Adaptive Distillation: Aggregating Knowledge from Multiple Paths for Efficient Distillation
Sumanth Chennupati
Mohammad Mahdi Kamani
Zhongwei Cheng
Lin Chen
35
4
0
19 Oct 2021
Dual Transfer Learning for Event-based End-task Prediction via Pluggable Event to Image Translation
Lin Wang
Yujeong Chae
Kuk-Jin Yoon
32
32
0
04 Sep 2021
LIGA-Stereo: Learning LiDAR Geometry Aware Representations for Stereo-based 3D Detector
Xiaoyang Guo
Shaoshuai Shi
Xiaogang Wang
Hongsheng Li
3DPC
36
106
0
18 Aug 2021
PQK: Model Compression via Pruning, Quantization, and Knowledge Distillation
Jang-Hyun Kim
Simyung Chang
Nojun Kwak
30
44
0
25 Jun 2021
Teacher's pet: understanding and mitigating biases in distillation
Michal Lukasik
Srinadh Bhojanapalli
A. Menon
Sanjiv Kumar
18
25
0
19 Jun 2021
Multi-Target Domain Adaptation with Collaborative Consistency Learning
Takashi Isobe
Xu Jia
Shuaijun Chen
Jianzhong He
Yongjie Shi
Jian-zhuo Liu
Huchuan Lu
Shengjin Wang
35
85
0
07 Jun 2021
Fast Camera Image Denoising on Mobile GPUs with Deep Learning, Mobile AI 2021 Challenge: Report
Andrey D. Ignatov
Kim Byeoung-su
Radu Timofte
Angeline Pouget
Fenglong Song
...
Lei Lei
Chaoyu Feng
L. Huang
Z. Lei
Feifei Chen
25
30
0
17 May 2021
Distilling Knowledge via Knowledge Review
Pengguang Chen
Shu Liu
Hengshuang Zhao
Jiaya Jia
155
424
0
19 Apr 2021