Decoupled Knowledge Distillation
arXiv: 2203.08679 (v2, latest)
16 March 2022
Borui Zhao, Quan Cui, Renjie Song, Yiyu Qiu, Jiajun Liang
ArXiv (abs) · PDF · HTML · GitHub (855★)
Papers citing "Decoupled Knowledge Distillation" (50 of 262 papers shown)
Ranking Distillation for Open-Ended Video Question Answering with Insufficient Labels
Tianming Liang, Chaolei Tan, Beihao Xia, Wei-Shi Zheng, Jianfang Hu (21 Mar 2024)

Scale Decoupled Distillation
Shicai Wei (20 Mar 2024)

LNPT: Label-free Network Pruning and Training
Jinying Xiao, Ping Li, Zhe Tang, Jie Nie (19 Mar 2024)

Scheduled Knowledge Acquisition on Lightweight Vector Symbolic Architectures for Brain-Computer Interfaces
Yejia Liu, Shijin Duan, Xiaolin Xu, Shaolei Ren (18 Mar 2024)

LIX: Implicitly Infusing Spatial Geometric Prior Knowledge into Visual Semantic Segmentation for Autonomous Driving
Sicen Guo, Zhiyuan Wu, Qijun Chen, Ioannis Pitas, Rui Fan (13 Mar 2024)

Robust Synthetic-to-Real Transfer for Stereo Matching [OOD]
Jiawei Zhang, Jiahe Li, Lei Huang, Xiaohan Yu, Lin Gu, Jin Zheng, Xiao Bai (12 Mar 2024)
V_kD: Improving Knowledge Distillation using Orthogonal Projections
Roy Miles, Ismail Elezi, Jiankang Deng (10 Mar 2024)
Frequency Attention for Knowledge Distillation
Cuong Pham, Van-Anh Nguyen, Trung Le, Dinh Q. Phung, Gustavo Carneiro, Thanh-Toan Do (09 Mar 2024)

PromptKD: Unsupervised Prompt Distillation for Vision-Language Models [VLM]
Zheng Li, Xiang Li, Xinyi Fu, Xing Zhang, Weiqiang Wang, Shuo Chen, Jian Yang (05 Mar 2024)

Logit Standardization in Knowledge Distillation
Shangquan Sun, Wenqi Ren, Jingzhi Li, Rui Wang, Xiaochun Cao (03 Mar 2024)

On the Road to Portability: Compressing End-to-End Motion Planner for Autonomous Driving
Kaituo Feng, Changsheng Li, Dongchun Ren, Ye Yuan, Guoren Wang (02 Mar 2024)

A SAM-guided Two-stream Lightweight Model for Anomaly Detection
Chenghao Li, Lei Qi, Xin Geng (29 Feb 2024)

Weakly Supervised Monocular 3D Detection with a Single-View Image
Xue-Qiu Jiang, Sheng Jin, Lewei Lu, Xiaoqin Zhang, Shijian Lu (29 Feb 2024)

Gradient Reweighting: Towards Imbalanced Class-Incremental Learning [CLL]
Jiangpeng He, Fengqing Zhu (28 Feb 2024)

Towards Robust and Efficient Cloud-Edge Elastic Model Adaptation via Selective Entropy Distillation
Yaofo Chen, Shuaicheng Niu, Yaowei Wang, Shoukai Xu, Hengjie Song, Mingkui Tan (27 Feb 2024)

TIE-KD: Teacher-Independent and Explainable Knowledge Distillation for Monocular Depth Estimation
Sangwon Choi, Daejune Choi, Duksu Kim (22 Feb 2024)

GraphKD: Exploring Knowledge Distillation Towards Document Object Detection with Structured Graph Creation
Ayan Banerjee, Sanket Biswas, Josep Lladós, Umapada Pal (17 Feb 2024)

On Good Practices for Task-Specific Distillation of Large Pretrained Visual Models [VLM, MQ]
Juliette Marrie, Michael Arbel, Julien Mairal, Diane Larlus (17 Feb 2024)

Knowledge Distillation Based on Transformed Teacher Matching
Kaixiang Zheng, En-Hui Yang (17 Feb 2024)

Data-efficient Large Vision Models through Sequential Autoregression [VLM]
Jianyuan Guo, Zhiwei Hao, Chengcheng Wang, Yehui Tang, Han Wu, Han Hu, Kai Han, Chang Xu (07 Feb 2024)

Good Teachers Explain: Explanation-Enhanced Knowledge Distillation [FAtt]
Amin Parchami-Araghi, Moritz Bohle, Sukrut Rao, Bernt Schiele (05 Feb 2024)

Learning from Teaching Regularization: Generalizable Correlations Should be Easy to Imitate
Can Jin, Tong Che, Hongwu Peng, Yiyuan Li, Dimitris N. Metaxas, Marco Pavone (05 Feb 2024)

Iterative Data Smoothing: Mitigating Reward Overfitting and Overoptimization in RLHF
Banghua Zhu, Michael I. Jordan, Jiantao Jiao (29 Jan 2024)

Rethinking Centered Kernel Alignment in Knowledge Distillation
Zikai Zhou, Yunhang Shen, Shitong Shao, Linrui Gong, Shaohui Lin (22 Jan 2024)

Progressive Distillation Based on Masked Generation Feature Method for Knowledge Graph Completion
Cunhang Fan, Yujie Chen, Jun Xue, Yonghui Kong, Jianhua Tao, Zhao Lv (19 Jan 2024)

Bayes Conditional Distribution Estimation for Knowledge Distillation Based on Conditional Mutual Information [VLM]
Linfeng Ye, Shayan Mohajer Hamidi, Renhao Tan, En-Hui Yang (16 Jan 2024)

Generative Denoise Distillation: Simple Stochastic Noises Induce Efficient Knowledge Transfer for Dense Prediction [VLM]
Zhaoge Liu, Xiaohao Xu, Yunkang Cao, Nong Sang (16 Jan 2024)

Direct Distillation between Different Domains
Jialiang Tang, Shuo Chen, Gang Niu, Hongyuan Zhu, Qiufeng Wang, Chen Gong, Masashi Sugiyama (12 Jan 2024)

Source-Free Cross-Modal Knowledge Transfer by Unleashing the Potential of Task-Irrelevant Data
Jinjin Zhu, Yucheng Chen, Lin Wang (10 Jan 2024)

Revisiting Knowledge Distillation under Distribution Shift
Songming Zhang, Ziyu Lyu, Xiaofeng Chen (25 Dec 2023)

StableKD: Breaking Inter-block Optimization Entanglement for Stable Knowledge Distillation
Shiu-hong Kao, Jierun Chen, S.-H. Gary Chan (20 Dec 2023)

Efficient LLM inference solution on Intel GPU
Hui Wu, Yi Gan, Feng Yuan, Jing Ma, Wei Zhu, ..., Hong Zhu, Yuhua Zhu, Xiaoli Liu, Jinghui Gu, Peng Zhao (19 Dec 2023)

RdimKD: Generic Distillation Paradigm by Dimensionality Reduction
Yi Guo, Yiqian He, Xiaoyang Li, Haotong Qin, Van Tung Pham, Yang Zhang, Shouda Liu (14 Dec 2023)

SKDF: A Simple Knowledge Distillation Framework for Distilling Open-Vocabulary Knowledge to Open-world Object Detector [ObjD]
Shuailei Ma, Yuefeng Wang, Ying-yu Wei, Jiaqi Fan, Enming Zhang, Xinyu Sun, Peihao Chen (14 Dec 2023)

Generative Model-based Feature Knowledge Distillation for Action Recognition [VLM]
Guiqin Wang, Peng Zhao, Yanjiang Shi, Cong Zhao, Shusen Yang (14 Dec 2023)

SlimSAM: 0.1% Data Makes Segment Anything Slim
Zigeng Chen, Gongfan Fang, Xinyin Ma, Xinchao Wang (08 Dec 2023)

EfficientSAM: Leveraged Masked Image Pretraining for Efficient Segment Anything [VLM]
Yunyang Xiong, Bala Varadarajan, Lemeng Wu, Xiaoyu Xiang, Fanyi Xiao, ..., Dilin Wang, Fei Sun, Forrest N. Iandola, Raghuraman Krishnamoorthi, Vikas Chandra (01 Dec 2023)

Generalized Large-Scale Data Condensation via Various Backbone and Statistical Matching [DD]
Shitong Shao, Zeyuan Yin, Muxin Zhou, Xindong Zhang, Zhiqiang Shen (29 Nov 2023)

SpliceMix: A Cross-scale and Semantic Blending Augmentation Strategy for Multi-label Image Classification
Lei Wang, Yibing Zhan, Leilei Ma, Dapeng Tao, Liang Ding, Chen Gong (26 Nov 2023)

Cosine Similarity Knowledge Distillation for Individual Class Information Transfer
Gyeongdo Ham, Seonghak Kim, Suin Lee, Jae-Hyeok Lee, Daeshik Kim (24 Nov 2023)

Maximizing Discrimination Capability of Knowledge Distillation with Energy Function
Seonghak Kim, Gyeongdo Ham, Suin Lee, Donggon Jang, Daeshik Kim (24 Nov 2023)

Robustness-Reinforced Knowledge Distillation with Correlation Distance and Network Pruning
Seonghak Kim, Gyeongdo Ham, Yucheol Cho, Daeshik Kim (23 Nov 2023)

One-for-All: Bridge the Gap Between Heterogeneous Architectures in Knowledge Distillation
Zhiwei Hao, Jianyuan Guo, Kai Han, Yehui Tang, Han Hu, Yunhe Wang, Chang Xu (30 Oct 2023)

Understanding the Effects of Projectors in Knowledge Distillation
Yudong Chen, Sen Wang, Jiajun Liu, Xuwei Xu, Frank de Hoog, Brano Kusy, Zi Huang (26 Oct 2023)

Towards the Fundamental Limits of Knowledge Transfer over Finite Domains
Qingyue Zhao, Banghua Zhu (11 Oct 2023)

Distilling Efficient Vision Transformers from CNNs for Semantic Segmentation
Xueye Zheng, Yunhao Luo, Pengyuan Zhou, Lin Wang (11 Oct 2023)

Bidirectional Knowledge Reconfiguration for Lightweight Point Cloud Analysis
Peipei Li, Xing Cui, Yibo Hu, Man Zhang, Ting Yao, Tao Mei (08 Oct 2023)

LumiNet: The Bright Side of Perceptual Knowledge Distillation
Md. Ismail Hossain, M. M. L. Elahi, Sameera Ramasinghe, A. Cheraghian, Fuad Rahman, Nabeel Mohammed, Shafin Rahman (05 Oct 2023)

Improving Knowledge Distillation with Teacher's Explanation [FAtt]
S. Chowdhury, Ben Liang, A. Tizghadam, Ilijc Albanese (04 Oct 2023)

NAYER: Noisy Layer Data Generation for Efficient and Effective Data-free Knowledge Distillation
Minh-Tuan Tran, Trung Le, Xuan-May Le, Mehrtash Harandi, Quan Hung Tran, Dinh Q. Phung (30 Sep 2023)