It's All in the Head: Representation Knowledge Distillation through Classifier Sharing
arXiv:2201.06945 · 18 January 2022
Emanuel Ben-Baruch, Matan Karklinsky, Yossi Biton, Avi Ben-Cohen, Hussam Lawen, Nadav Zamir

Papers citing "It's All in the Head: Representation Knowledge Distillation through Classifier Sharing" (6 of 6 papers shown)

CustomKD: Customizing Large Vision Foundation for Edge Model Improvement via Knowledge Distillation
Jungsoo Lee, Debasmit Das, Munawar Hayat, Sungha Choi, Kyuwoong Hwang, Fatih Porikli
VLM · 23 Mar 2025

Quantifying Knowledge Distillation Using Partial Information Decomposition
Pasan Dissanayake, Faisal Hamman, Barproda Halder, Ilia Sucholutsky, Qiuyi Zhang, Sanghamitra Dutta
12 Nov 2024

AdaDistill: Adaptive Knowledge Distillation for Deep Face Recognition
Fadi Boutros, Vitomir Štruc, Naser Damer
01 Jul 2024

FedDr+: Stabilizing Dot-regression with Global Feature Distillation for Federated Learning
Seongyoon Kim, Minchan Jeong, Sungnyun Kim, Sungwoo Cho, Sumyeong Ahn, Se-Young Yun
FedML · 04 Jun 2024

ImageNet-21K Pretraining for the Masses
T. Ridnik, Emanuel Ben-Baruch, Asaf Noy, Lihi Zelnik-Manor
SSeg · VLM · CLIP · 22 Apr 2021

Learning Student-Friendly Teacher Networks for Knowledge Distillation
D. Park, Moonsu Cha, C. Jeong, Daesin Kim, Bohyung Han
12 Feb 2021