It's All in the Head: Representation Knowledge Distillation through Classifier Sharing

18 January 2022
Emanuel Ben-Baruch, M. Karklinsky, Yossi Biton, Avi Ben-Cohen, Hussam Lawen, Nadav Zamir

Papers citing "It's All in the Head: Representation Knowledge Distillation through Classifier Sharing"

6 / 6 papers shown
CustomKD: Customizing Large Vision Foundation for Edge Model Improvement via Knowledge Distillation
Jungsoo Lee, Debasmit Das, Munawar Hayat, Sungha Choi, Kyuwoong Hwang, Fatih Porikli
Topics: VLM
23 Mar 2025

Quantifying Knowledge Distillation Using Partial Information Decomposition
Pasan Dissanayake, Faisal Hamman, Barproda Halder, Ilia Sucholutsky, Qiuyi Zhang, Sanghamitra Dutta
12 Nov 2024

AdaDistill: Adaptive Knowledge Distillation for Deep Face Recognition
Fadi Boutros, Vitomir Štruc, Naser Damer
01 Jul 2024

FedDr+: Stabilizing Dot-regression with Global Feature Distillation for Federated Learning
Seongyoon Kim, Minchan Jeong, Sungnyun Kim, Sungwoo Cho, Sumyeong Ahn, Se-Young Yun
Topics: FedML
04 Jun 2024

ImageNet-21K Pretraining for the Masses
T. Ridnik, Emanuel Ben-Baruch, Asaf Noy, Lihi Zelnik-Manor
Topics: SSeg, VLM, CLIP
22 Apr 2021

Learning Student-Friendly Teacher Networks for Knowledge Distillation
D. Park, Moonsu Cha, C. Jeong, Daesin Kim, Bohyung Han
12 Feb 2021