arXiv: 2203.08679 · v2 (latest)
Decoupled Knowledge Distillation
16 March 2022 · Borui Zhao, Quan Cui, Renjie Song, Yiyu Qiu, Jiajun Liang
Links: arXiv (abs) · PDF · HTML · GitHub (855★)
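For context on the paper this list cites: Decoupled Knowledge Distillation reformulates the classical KD loss into two independently weighted terms, target-class KD (TCKD) over the binary target/non-target split and non-target-class KD (NCKD) over the renormalized non-target classes. A minimal pure-Python sketch of that decomposition follows; the defaults α=1, β=8, T=4 are illustrative hyperparameters, not values taken from this page.

```python
import math

def softmax(zs):
    """Numerically stable softmax over a list of logits."""
    m = max(zs)
    exps = [math.exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]

def kl_div(p, q):
    """KL(p || q) with p as the reference (teacher) distribution."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def dkd_loss(student_logits, teacher_logits, target, alpha=1.0, beta=8.0, T=4.0):
    p_s = softmax([z / T for z in student_logits])
    p_t = softmax([z / T for z in teacher_logits])
    # TCKD: KL between binary (target vs. all-non-target) distributions
    b_s = [p_s[target], 1.0 - p_s[target]]
    b_t = [p_t[target], 1.0 - p_t[target]]
    tckd = kl_div(b_t, b_s) * T * T
    # NCKD: KL over non-target classes only, renormalized to sum to 1
    q_s = [p / (1.0 - p_s[target]) for i, p in enumerate(p_s) if i != target]
    q_t = [p / (1.0 - p_t[target]) for i, p in enumerate(p_t) if i != target]
    nckd = kl_div(q_t, q_s) * T * T
    return alpha * tckd + beta * nckd
```

When student and teacher logits match, both KL terms vanish and the loss is zero; the β weight lets NCKD (the "dark knowledge" among non-target classes) be emphasized independently of TCKD.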
Papers citing "Decoupled Knowledge Distillation" (50 of 262 papers shown)
KDMOS: Knowledge Distillation for Motion Segmentation
Chunyu Cao, Jintao Cheng, Zeyu Chen, Linfan Zhan, Rui Fan, Zhijian He, Xiaoyu Tang
17 Jun 2025

Shapley Machine: A Game-Theoretic Framework for N-Agent Ad Hoc Teamwork
Jianhong Wang, Yang Li, Samuel Kaski, Jonathan Lawry
12 Jun 2025
Data-Efficient Challenges in Visual Inductive Priors: A Retrospective [VLM]
Robert-Jan Bruintjes, A. Lengyel, O. Kayhan, Davide Zambrano, Nergis Tomen, Hadi Jamali Rad, Jan van Gemert
10 Jun 2025

Progressive Class-level Distillation
JiaYan Li, Jun Li, Zhourui Zhang, Jianhua Xu
30 May 2025

AutoReproduce: Automatic AI Experiment Reproduction with Paper Lineage
Xuanle Zhao, Zilin Sang, Yuxuan Li, Qi Shi, Shuo Wang, Duzhen Zhang, Xu Han, Zhiyuan Liu, Maosong Sun
27 May 2025
Single Domain Generalization for Few-Shot Counting via Universal Representation Matching [OOD]
Xianing Chen, Si Huo, Borui Jiang, Hailin Hu, Xinghao Chen
22 May 2025

DeepKD: A Deeply Decoupled and Denoised Knowledge Distillation Trainer
Haiduo Huang, Jiangcheng Song, Yadong Zhang, Pengju Ren
21 May 2025

Collaborative Unlabeled Data Optimization
Xinyi Shang, Peng Sun, Fengyuan Liu, Tao Lin
20 May 2025

Equally Critical: Samples, Targets, and Their Mappings in Datasets
Runkang Yang, Peng Sun, Xinyi Shang, Yi Tang, Tao R. Lin
17 May 2025
FiGKD: Fine-Grained Knowledge Distillation via High-Frequency Detail Transfer
Seonghak Kim
17 May 2025

MoKD: Multi-Task Optimization for Knowledge Distillation [VLM]
Zeeshan Hayder, A. Cheraghian, Lars Petersson, Mehrtash Harandi
13 May 2025

Uncertainty-Aware Multi-Expert Knowledge Distillation for Imbalanced Disease Grading
Shuo Tong, Shangde Gao, Ke Liu, Zihang Huang, Hongxia Xu, Haochao Ying, Jian Wu
01 May 2025

Head-Tail-Aware KL Divergence in Knowledge Distillation for Spiking Neural Networks
Tianqing Zhang, Zixin Zhu, Kairong Yu, Hongwei Wang
29 Apr 2025
Swapped Logit Distillation via Bi-level Teacher Alignment
Stephen Ekaputra Limantoro, Jhe-Hao Lin, Chih-Yu Wang, Yi-Lung Tsai, Hong-Han Shuai, Ching-Chun Huang, Wen-Huang Cheng
27 Apr 2025

Feature Alignment and Representation Transfer in Knowledge Distillation for Large Language Models [VLM]
Junjie Yang, Junhao Song, Xudong Han, Ziqian Bi, Tianyang Wang, ..., Yize Zhang, Qian Niu, Benji Peng, Keyu Chen, Ming Liu
18 Apr 2025

Random Conditioning with Distillation for Data-Efficient Diffusion Model Compression [DiffM]
Dohyun Kim, S. Park, Geonhee Han, Seung Wook Kim, Paul Hongsuck Seo
02 Apr 2025
Sample-level Adaptive Knowledge Distillation for Action Recognition
Ping Li, Chenhao Ping, Wenxiao Wang, Mingli Song
01 Apr 2025

Decoupled Distillation to Erase: A General Unlearning Method for Any Class-centric Tasks [MU]
Yu Zhou, Dian Zheng, Qijie Mo, Renjie Lu, Kun-Yu Lin, Wei-Shi Zheng
31 Mar 2025

Delving Deep into Semantic Relation Distillation
Zhaoyi Yan, Kangjun Liu, Qixiang Ye
27 Mar 2025

CustomKD: Customizing Large Vision Foundation for Edge Model Improvement via Knowledge Distillation [VLM]
Jungsoo Lee, Debasmit Das, Munawar Hayat, Sungha Choi, Kyuwoong Hwang, Fatih Porikli
23 Mar 2025
Cross-Modal and Uncertainty-Aware Agglomeration for Open-Vocabulary 3D Scene Understanding
Jinlong Li, Cristiano Saltori, Fabio Poiesi, N. Sebe
20 Mar 2025

SCJD: Sparse Correlation and Joint Distillation for Efficient 3D Human Pose Estimation [3DH]
Weihong Chen, Xuemiao Xu, Haoxin Yang, Yi Xie, Peng Xiao, Cheng Xu, Huaidong Zhang, Pheng-Ann Heng
18 Mar 2025

Adaptive Temperature Based on Logits Correlation in Knowledge Distillation
Kazuhiro Matsuyama, Usman Anjum, Satoko Matsuyama, Tetsuo Shoda, J. Zhan
12 Mar 2025

AugFL: Augmenting Federated Learning with Pretrained Models [FedML]
Sheng Yue, Zerui Qin, Yongheng Deng, Ju Ren, Yaoxue Zhang, Junshan Zhang
04 Mar 2025
VRM: Knowledge Distillation via Virtual Relation Matching
W. Zhang, Fei Xie, Weidong Cai, Chao Ma
28 Feb 2025

Multi-Level Decoupled Relational Distillation for Heterogeneous Architectures
Yaoxin Yang, Peng Ye, Weihao Lin, Kangcong Li, Yan Wen, Jia Hao, Tao Chen
10 Feb 2025

Contrastive Representation Distillation via Multi-Scale Feature Decoupling
Cuipeng Wang, Tieyuan Chen, Haipeng Wang
09 Feb 2025

TAID: Temporally Adaptive Interpolated Distillation for Efficient Knowledge Transfer in Language Models [VLM]
Makoto Shing, Kou Misaki, Han Bao, Sho Yokoi, Takuya Akiba
28 Jan 2025

Rethinking Knowledge in Distillation: An In-context Sample Retrieval Perspective
Jinjing Zhu, Songze Li, Lin Wang
13 Jan 2025
ECG-guided individual identification via PPG
Riling Wei, Hanjie Chen, Kelu Yao, Chuanguang Yang, Jun Wang, Chao Li
30 Dec 2024

Data Pruning Can Do More: A Comprehensive Data Pruning Approach for Object Re-identification [MoMe, VLM]
Zi Yang, Haojin Yang, Soumajit Majumder, Jorge M. Cardoso, Guillermo Gallego
13 Dec 2024

Wasserstein Distance Rivals Kullback-Leibler Divergence for Knowledge Distillation
Jiaming Lv, Haoyuan Yang, P. Li
11 Dec 2024

Federated Progressive Self-Distillation with Logits Calibration for Personalized IIoT Edge Intelligence
Yingchao Wang, Wenqi Niu
30 Nov 2024

Large-Scale Data-Free Knowledge Distillation for ImageNet via Multi-Resolution Data Generation
Minh-Tuan Tran, Trung Le, Xuan-May Le, Jianfei Cai, Mehrtash Harandi, Dinh Q. Phung
26 Nov 2024

Map-Free Trajectory Prediction with Map Distillation and Hierarchical Encoding
Xiaodong Liu, Yucheng Xing, Xin Wang
17 Nov 2024
Dual-Head Knowledge Distillation: Enhancing Logits Utilization with an Auxiliary Head
Penghui Yang, Chen-Chen Zong, Sheng-Jun Huang, Lei Feng, Bo An
13 Nov 2024

Over-parameterized Student Model via Tensor Decomposition Boosted Knowledge Distillation
Yu-Liang Zhan, Zhong-Yi Lu, Hao Sun, Ze-Feng Gao
10 Nov 2024

Dynamic Textual Prompt For Rehearsal-free Lifelong Person Re-identification [VLM]
Hongyu Chen, Bingliang Jiao, Wenxuan Wang, Peng Wang
09 Nov 2024

Decoupling Dark Knowledge via Block-wise Logit Distillation for Feature-level Alignment
Chengting Yu, Fengzhao Zhang, Ruizhe Chen, Zuozhu Liu, Shurun Tan, Er-ping Li, Aili Wang
03 Nov 2024

Preview-based Category Contrastive Learning for Knowledge Distillation [VLM]
Muhe Ding, Jianlong Wu, Xue Dong, Xiaojie Li, Pengda Qin, Tian Gan, Liqiang Nie
18 Oct 2024
CAKD: A Correlation-Aware Knowledge Distillation Framework Based on Decoupling Kullback-Leibler Divergence
Zao Zhang, Huaming Chen, Pei Ning, Nan Yang, Dong Yuan
17 Oct 2024

TAS: Distilling Arbitrary Teacher and Student via a Hybrid Assistant
Guopeng Li, Qiang Wang, K. Yan, Shouhong Ding, Yuan Gao, Gui-Song Xia
16 Oct 2024

Efficient and Robust Knowledge Distillation from A Stronger Teacher Based on Correlation Matching
Wenqi Niu, Yingchao Wang, Guohui Cai, Hanpo Hou
09 Oct 2024

Gap Preserving Distillation by Building Bidirectional Mappings with A Dynamic Teacher
Yong Guo, Shulian Zhang, Haolin Pan, Jing Liu, Yulun Zhang, Jian Chen
05 Oct 2024

Linear Projections of Teacher Embeddings for Few-Class Distillation
Noel Loo, Fotis Iliopoulos, Wei Hu, Erik Vee
30 Sep 2024
Harmonizing knowledge Transfer in Neural Network with Unified Distillation
Yaomin Huang, Zaomin Yan, Yaxin Peng, Faming Fang, Guixu Zhang
27 Sep 2024

ReliOcc: Towards Reliable Semantic Occupancy Prediction via Uncertainty Learning [UQCV]
Song Wang, Zhongdao Wang, Jiawei Yu, Wentong Li, Bailan Feng, Junbo Chen, Jianke Zhu
26 Sep 2024
Enhancing Logits Distillation with Plug&Play Kendall's τ Ranking Loss
Yuchen Guan, Runxi Cheng, Kang Liu, Chun Yuan
26 Sep 2024
Towards Model-Agnostic Dataset Condensation by Heterogeneous Models [DD]
Jun-Yeong Moon, Jung Uk Kim, Gyeong-Moon Park
22 Sep 2024

Learn from Balance: Rectifying Knowledge Transfer for Long-Tailed Scenarios
Xinlei Huang, Jialiang Tang, Xubin Zheng, Jinjia Zhou, Wenxin Yu, Ning Jiang
12 Sep 2024