Class Attention Transfer Based Knowledge Distillation
Ziyao Guo, Haonan Yan, Hui Li, Xiaodong Lin
25 April 2023 · arXiv: 2304.12777
ArXiv (abs) · PDF · HTML · GitHub (44★)

Cited By
Papers citing "Class Attention Transfer Based Knowledge Distillation" (31 / 31 papers shown)
1. DeepKD: A Deeply Decoupled and Denoised Knowledge Distillation Trainer. Haiduo Huang, Jiangcheng Song, Yadong Zhang, Pengju Ren. 21 May 2025.
2. FiGKD: Fine-Grained Knowledge Distillation via High-Frequency Detail Transfer. Seonghak Kim. 17 May 2025.
3. DCSNet: A Lightweight Knowledge Distillation-Based Model with Explainable AI for Lung Cancer Diagnosis from Histopathological Images. Sadman Sakib Alif, Nasim Anzum Promise, Fiaz Al Abid, Aniqua Nusrat Zereen. 14 May 2025.
4. CAE-DFKD: Bridging the Transferability Gap in Data-Free Knowledge Distillation. Zherui Zhang, Changwei Wang, Rongtao Xu, Wenyuan Xu, Shibiao Xu, Yu Zhang, Li Guo. 30 Apr 2025.
5. Swapped Logit Distillation via Bi-level Teacher Alignment. Stephen Ekaputra Limantoro, Jhe-Hao Lin, Chih-Yu Wang, Yi-Lung Tsai, Hong-Han Shuai, Ching-Chun Huang, Wen-Huang Cheng. 27 Apr 2025.
6. HDC: Hierarchical Distillation for Multi-level Noisy Consistency in Semi-Supervised Fetal Ultrasound Segmentation. Tran Quoc Khanh Le, Nguyen Lan Vi Vu, Ha-Hieu Pham, Xuan-Loc Huynh, T. Nguyen, Minh Huu Nhat Le, Quan Nguyen, Hien Nguyen. 14 Apr 2025.
7. Adaptive Temperature Based on Logits Correlation in Knowledge Distillation. Kazuhiro Matsuyama, Usman Anjum, Satoko Matsuyama, Tetsuo Shoda, J. Zhan. 12 Mar 2025.
8. Task-Specific Knowledge Distillation from the Vision Foundation Model for Enhanced Medical Image Segmentation. Pengchen Liang, Haishan Huang, Bin Pu, Jianguo Chen, Xiang Hua, Jing Zhang, Weibo Ma, Z. Chen, Yiwei Li, Qing Chang. 10 Mar 2025.
9. VRM: Knowledge Distillation via Virtual Relation Matching. W. Zhang, Fei Xie, Weidong Cai, Chao Ma. 28 Feb 2025.
10. Multi-Level Decoupled Relational Distillation for Heterogeneous Architectures. Yaoxin Yang, Peng Ye, Weihao Lin, Kangcong Li, Yan Wen, Jia Hao, Tao Chen. 10 Feb 2025.
11. Contrastive Representation Distillation via Multi-Scale Feature Decoupling. Cuipeng Wang, Tieyuan Chen, Haipeng Wang. 09 Feb 2025.
12. Rethinking Knowledge in Distillation: An In-context Sample Retrieval Perspective. Jinjing Zhu, Songze Li, Lin Wang. 13 Jan 2025.
13. Wasserstein Distance Rivals Kullback-Leibler Divergence for Knowledge Distillation. Jiaming Lv, Haoyuan Yang, P. Li. 11 Dec 2024.
14. Dynamic Textual Prompt For Rehearsal-free Lifelong Person Re-identification. Hongyu Chen, Bingliang Jiao, Wenxuan Wang, Peng Wang. [VLM] 09 Nov 2024.
15. Gap Preserving Distillation by Building Bidirectional Mappings with A Dynamic Teacher. Yong Guo, Shulian Zhang, Haolin Pan, Jing Liu, Yulun Zhang, Jian Chen. 05 Oct 2024.
16. Enhancing Logits Distillation with Plug&Play Kendall's τ Ranking Loss. Yuchen Guan, Runxi Cheng, Kang Liu, Chun Yuan. 26 Sep 2024.
17. Learn from Balance: Rectifying Knowledge Transfer for Long-Tailed Scenarios. Xinlei Huang, Jialiang Tang, Xubin Zheng, Jinjia Zhou, Wenxin Yu, Ning Jiang. 12 Sep 2024.
18. DisCoM-KD: Cross-Modal Knowledge Distillation via Disentanglement Representation and Adversarial Learning. Dino Ienco, C. Dantas. 05 Aug 2024.
19. Disentangling spatio-temporal knowledge for weakly supervised object detection and segmentation in surgical video. Guiqiu Liao, M. Jogan, Sai Koushik, Eric Eaton, Daniel A. Hashimoto. [VOS] 22 Jul 2024.
20. Instance Temperature Knowledge Distillation. Zhengbo Zhang, Yuxi Zhou, Jia Gong, Jun Liu, Zhigang Tu. 27 Jun 2024.
21. Estimating Human Poses Across Datasets: A Unified Skeleton and Multi-Teacher Distillation Approach. Muhammad Gul Zain Ali Khan, Dhavalkumar Limbachiya, Didier Stricker, Muhammad Zeshan Afzal. [3DH] 30 May 2024.
22. Aligning in a Compact Space: Contrastive Knowledge Distillation between Heterogeneous Architectures. Hongjun Wu, Li Xiao, Xingkuo Zhang, Yining Miao. 28 May 2024.
23. CKD: Contrastive Knowledge Distillation from A Sample-wise Perspective. Wencheng Zhu, Xin Zhou, Pengfei Zhu, Yu Wang, Qinghua Hu. [VLM] 22 Apr 2024.
24. A Comprehensive Review of Knowledge Distillation in Computer Vision. Sheikh Musa Kaleem, Tufail Rouf, Gousia Habib, Tausifa Jan Saleem, Brejesh Lall. [VLM] 01 Apr 2024.
25. Logit Standardization in Knowledge Distillation. Shangquan Sun, Wenqi Ren, Jingzhi Li, Rui Wang, Xiaochun Cao. 03 Mar 2024.
26. Good Teachers Explain: Explanation-Enhanced Knowledge Distillation. Amin Parchami-Araghi, Moritz Böhle, Sukrut Rao, Bernt Schiele. [FAtt] 05 Feb 2024.
27. Online Robot Navigation and Manipulation with Distilled Vision-Language Models. Kangcheng Liu. 30 Jan 2024.
28. Direct Distillation between Different Domains. Jialiang Tang, Shuo Chen, Gang Niu, Hongyuan Zhu, Qiufeng Wang, Chen Gong, Masashi Sugiyama. 12 Jan 2024.
29. Maximizing Discrimination Capability of Knowledge Distillation with Energy Function. Seonghak Kim, Gyeongdo Ham, Suin Lee, Donggon Jang, Daeshik Kim. 24 Nov 2023.
30. Review helps learn better: Temporal Supervised Knowledge Distillation. Dongwei Wang, Zhi Han, Yanmei Wang, Xi’ai Chen, Baichen Liu, Yandong Tang. 03 Jul 2023.
31. Contrastive Representation Distillation. Yonglong Tian, Dilip Krishnan, Phillip Isola. 23 Oct 2019.