Black-box Few-shot Knowledge Distillation
arXiv: 2207.12106
25 July 2022
Dang Nguyen, Sunil R. Gupta, Kien Do, Svetha Venkatesh
Papers citing "Black-box Few-shot Knowledge Distillation" (10 of 10 shown)
| Title | Authors | Topics | Citations | Date |
| --- | --- | --- | --- | --- |
| Tiny models from tiny data: Textual and null-text inversion for few-shot distillation | Erik Landolsi, Fredrik Kahl | DiffM | 1 | 05 Jun 2024 |
| Distill on the Go: Online knowledge distillation in self-supervised learning | Prashant Shivaram Bhat, Elahe Arani, Bahram Zonooz | SSL | 28 | 20 Apr 2021 |
| Heterogeneous Knowledge Distillation using Information Flow Modeling | Nikolaos Passalis, Maria Tzelepi, Anastasios Tefas | | 139 | 02 May 2020 |
| Dreaming to Distill: Data-free Knowledge Transfer via DeepInversion | Hongxu Yin, Pavlo Molchanov, Zhizhong Li, J. Álvarez, Arun Mallya, Derek Hoiem, N. Jha, Jan Kautz | | 565 | 18 Dec 2019 |
| Contrastive Representation Distillation | Yonglong Tian, Dilip Krishnan, Phillip Isola | | 1,049 | 23 Oct 2019 |
| Graph-based Knowledge Distillation by Multi-head Attention Network | Seunghyun Lee, B. Song | | 77 | 04 Jul 2019 |
| Zero-Shot Knowledge Distillation in Deep Networks | Gaurav Kumar Nayak, Konda Reddy Mopuri, Vaisakh Shaj, R. Venkatesh Babu, Anirban Chakraborty | | 245 | 20 May 2019 |
| Conditional Teacher-Student Learning | Zhong Meng, Jinyu Li, Yong Zhao, Jiawei Liu | | 90 | 28 Apr 2019 |
| mixup: Beyond Empirical Risk Minimization | Hongyi Zhang, Moustapha Cissé, Yann N. Dauphin, David Lopez-Paz | NoLa | 9,760 | 25 Oct 2017 |
| FitNets: Hints for Thin Deep Nets | Adriana Romero, Nicolas Ballas, Samira Ebrahimi Kahou, Antoine Chassang, C. Gatta, Yoshua Bengio | FedML | 3,883 | 19 Dec 2014 |