Self-Knowledge Distillation via Dropout
Hyoje Lee, Yeachan Park, Hyun Seo, Myung-joo Kang
11 August 2022
arXiv:2208.05642 · PDF · HTML

Papers citing "Self-Knowledge Distillation via Dropout" (4 of 4 papers shown):

ABKD: Pursuing a Proper Allocation of the Probability Mass in Knowledge Distillation via α-β-Divergence
Guanghui Wang, Zhiyong Yang, Zhilin Wang, Shi Wang, Qianqian Xu, Qingming Huang
07 May 2025 · 0 citations

MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam
17 Apr 2017 · 20,599 citations

Mean teachers are better role models: Weight-averaged consistency targets improve semi-supervised deep learning results
Antti Tarvainen, Harri Valpola
06 Mar 2017 · 1,275 citations

Densely Connected Convolutional Networks
Gao Huang, Zhuang Liu, Laurens van der Maaten, Kilian Q. Weinberger
25 Aug 2016 · 36,420 citations