2102.02973
Cited By
Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching
5 February 2021
Mingi Ji
Byeongho Heo
Sungrae Park
Papers citing
"Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching"
25 / 25 papers shown
Selective Structured State Space for Multispectral-fused Small Target Detection
Qianqian Zhang
WeiJun Wang
Yunxing Liu
Li Zhou
Hao Zhao
Junshe An
Zihan Wang
Mamba
62
0
0
20 May 2025
DCSNet: A Lightweight Knowledge Distillation-Based Model with Explainable AI for Lung Cancer Diagnosis from Histopathological Images
Sadman Sakib Alif
Nasim Anzum Promise
Fiaz Al Abid
Aniqua Nusrat Zereen
37
0
0
14 May 2025
Image Recognition with Online Lightweight Vision Transformer: A Survey
Zherui Zhang
Rongtao Xu
Jie Zhou
Changwei Wang
Xingtian Pei
...
Jiguang Zhang
Li Guo
Longxiang Gao
Wenyuan Xu
Shibiao Xu
ViT
301
0
0
06 May 2025
Indirect Gradient Matching for Adversarial Robust Distillation
Hongsin Lee
Seungju Cho
Changick Kim
AAML
FedML
55
2
0
06 Dec 2023
Understanding the Effects of Projectors in Knowledge Distillation
Yudong Chen
Sen Wang
Jiajun Liu
Xuwei Xu
Frank de Hoog
Brano Kusy
Zi Huang
42
0
0
26 Oct 2023
Bidirectional Knowledge Reconfiguration for Lightweight Point Cloud Analysis
Peipei Li
Xing Cui
Yibo Hu
Man Zhang
Ting Yao
Tao Mei
38
0
0
08 Oct 2023
Review helps learn better: Temporal Supervised Knowledge Distillation
Dongwei Wang
Zhi Han
Yanmei Wang
Xi’ai Chen
Baichen Liu
Yandong Tang
62
1
0
03 Jul 2023
Filter Pruning for Efficient CNNs via Knowledge-driven Differential Filter Sampler
Shaohui Lin
Wenxuan Huang
Jiao Xie
Baochang Zhang
Yunhang Shen
Zhou Yu
Jungong Han
David Doermann
33
2
0
01 Jul 2023
Cross Architecture Distillation for Face Recognition
Weisong Zhao
Xiangyu Zhu
Zhixiang He
Xiaoyu Zhang
Zhen Lei
CVBM
45
6
0
26 Jun 2023
Student-friendly Knowledge Distillation
Mengyang Yuan
Bo Lang
Fengnan Quan
48
18
0
18 May 2023
Function-Consistent Feature Distillation
Dongyang Liu
Meina Kan
Shiguang Shan
Xilin Chen
77
19
0
24 Apr 2023
Semantic Scene Completion with Cleaner Self
Fengyun Wang
Dong Zhang
Hanwang Zhang
Jinhui Tang
Qianru Sun
36
12
0
17 Mar 2023
Hierarchical Network with Decoupled Knowledge Distillation for Speech Emotion Recognition
Ziping Zhao
Haiquan Wang
Haishuai Wang
Björn Schuller
32
5
0
09 Mar 2023
Guided Hybrid Quantization for Object detection in Multimodal Remote Sensing Imagery via One-to-one Self-teaching
Jiaqing Zhang
Jie Lei
Weiying Xie
Yunsong Li
Wenxuan Wang
MQ
53
19
0
31 Dec 2022
Hilbert Distillation for Cross-Dimensionality Networks
Dian Qin
Haishuai Wang
Zhe Liu
Hongjia Xu
Sheng Zhou
Jiajun Bu
28
4
0
08 Nov 2022
Improved Feature Distillation via Projector Ensemble
Yudong Chen
Sen Wang
Jiajun Liu
Xuwei Xu
Frank de Hoog
Zi Huang
44
38
0
27 Oct 2022
Variance Tolerance Factors For Interpreting ALL Neural Networks
Sichao Li
Amanda S. Barnard
FAtt
45
3
0
28 Sep 2022
Knowledge Distillation via the Target-aware Transformer
Sihao Lin
Hongwei Xie
Bing Wang
Kaicheng Yu
Xiaojun Chang
Xiaodan Liang
G. Wang
ViT
24
104
0
22 May 2022
Knowledge Distillation with Deep Supervision
Shiya Luo
Defang Chen
Can Wang
35
1
0
16 Feb 2022
Exploring Inter-Channel Correlation for Diversity-preserved Knowledge Distillation
Li Liu
Qingle Huang
Sihao Lin
Hongwei Xie
Bing Wang
Xiaojun Chang
Xiao-Xue Liang
30
100
0
08 Feb 2022
Auto-Transfer: Learning to Route Transferrable Representations
K. Murugesan
Vijay Sadashivaiah
Ronny Luss
Karthikeyan Shanmugam
Pin-Yu Chen
Amit Dhurandhar
AAML
56
5
0
02 Feb 2022
Oracle Teacher: Leveraging Target Information for Better Knowledge Distillation of CTC Models
J. Yoon
H. Kim
Hyeon Seung Lee
Sunghwan Ahn
N. Kim
46
1
0
05 Nov 2021
RAIL-KD: RAndom Intermediate Layer Mapping for Knowledge Distillation
Md. Akmal Haidar
Nithin Anchuri
Mehdi Rezagholizadeh
Abbas Ghaddar
Philippe Langlais
Pascal Poupart
47
22
0
21 Sep 2021
Knowledge Distillation with Noisy Labels for Natural Language Understanding
Shivendra Bhardwaj
Abbas Ghaddar
Ahmad Rashid
Khalil Bibi
Cheng-huan Li
A. Ghodsi
Philippe Langlais
Mehdi Rezagholizadeh
26
1
0
21 Sep 2021
Student Network Learning via Evolutionary Knowledge Distillation
Kangkai Zhang
Chunhui Zhang
Shikun Li
Dan Zeng
Shiming Ge
39
83
0
23 Mar 2021