Channel Distillation: Channel-Wise Attention for Knowledge Distillation
arXiv:2006.01683 · 2 June 2020
Zaida Zhou, Chaoran Zhuge, Xinwei Guan, Wen Liu
Papers citing "Channel Distillation: Channel-Wise Attention for Knowledge Distillation" (5 of 5 shown)

Data-free Knowledge Distillation for Fine-grained Visual Categorization
Renrong Shao, Wei Zhang, Jianhua Yin, Jun Wang
18 Apr 2024

Attention-guided Feature Distillation for Semantic Segmentation
Amir M. Mansourian, Arya Jalali, Rozhan Ahmadi, S. Kasaei
08 Mar 2024

Student-friendly Knowledge Distillation
Mengyang Yuan, Bo Lang, Fengnan Quan
18 May 2023

Multi scale Feature Extraction and Fusion for Online Knowledge Distillation
Panpan Zou, Yinglei Teng, Tao Niu
16 Jun 2022

Teacher's pet: understanding and mitigating biases in distillation
Michal Lukasik, Srinadh Bhojanapalli, A. Menon, Sanjiv Kumar
19 Jun 2021