Subclass Distillation
Rafael Müller, Simon Kornblith, Geoffrey E. Hinton
arXiv:2002.03936, 10 February 2020
Papers citing "Subclass Distillation" (10 of 10 papers shown):
- Linear Projections of Teacher Embeddings for Few-Class Distillation. Noel Loo, Fotis Iliopoulos, Wei Hu, Erik Vee. 30 Sep 2024.
- Heterogeneous Federated Learning Using Knowledge Codistillation. Jared Lichtarge, Ehsan Amid, Shankar Kumar, Tien-Ju Yang, Rohan Anil, Rajiv Mathews. [FedML] 04 Oct 2023.
- Supervision Complexity and its Role in Knowledge Distillation. Hrayr Harutyunyan, A. S. Rawat, A. Menon, Seungyeon Kim, Sanjiv Kumar. 28 Jan 2023.
- Layerwise Bregman Representation Learning with Applications to Knowledge Distillation. Ehsan Amid, Rohan Anil, Christopher Fifty, Manfred K. Warmuth. 15 Sep 2022.
- Transfer without Forgetting. Matteo Boschini, Lorenzo Bonicelli, Angelo Porrello, Giovanni Bellitto, M. Pennisi, S. Palazzo, C. Spampinato, Simone Calderara. [CLL] 01 Jun 2022.
- On the Efficiency of Subclass Knowledge Distillation in Classification Tasks. A. Sajedi, Konstantinos N. Plataniotis. 12 Sep 2021.
- Learning from Weakly-labeled Web Videos via Exploring Sub-Concepts. Kunpeng Li, Zizhao Zhang, Guanhang Wu, Xuehan Xiong, Chen-Yu Lee, Zhichao Lu, Y. Fu, Tomas Pfister. 11 Jan 2021.
- No Subclass Left Behind: Fine-Grained Robustness in Coarse-Grained Classification Problems. N. Sohoni, Jared A. Dunnmon, Geoffrey Angus, Albert Gu, Christopher Ré. 25 Nov 2020.
- Anti-Distillation: Improving reproducibility of deep networks. G. Shamir, Lorenzo Coviello. 19 Oct 2020.
- Large scale distributed neural network training through online distillation. Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton. [FedML] 09 Apr 2018.