Self-Knowledge Distillation via Dropout
arXiv: 2208.05642
11 August 2022
Hyoje Lee
Yeachan Park
Hyun Seo
Myung-joo Kang
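The page lists only the paper's metadata, not its method. As an illustrative sketch only — assuming, from the title alone, that "self-knowledge distillation via dropout" means matching a network's predictions under two independent dropout masks (here with a symmetric KL loss, a common choice for this kind of consistency objective; the network, shapes, and loss weighting below are all hypothetical) — the idea can be demonstrated in a few lines of NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def forward(x, W1, W2, p, rng):
    """One pass through a toy MLP with a fresh inverted-dropout mask."""
    h = np.maximum(x @ W1, 0.0)                    # ReLU hidden layer
    mask = (rng.random(h.shape) > p) / (1.0 - p)   # inverted dropout, rate p
    return softmax((h * mask) @ W2)

def kl(p_dist, q_dist, eps=1e-12):
    """Row-wise KL divergence KL(p || q) between probability vectors."""
    return np.sum(p_dist * (np.log(p_dist + eps) - np.log(q_dist + eps)), axis=-1)

# Toy batch and randomly initialized weights (purely illustrative).
x = rng.normal(size=(4, 8))
W1 = rng.normal(size=(8, 16)) * 0.1
W2 = rng.normal(size=(16, 5)) * 0.1

p1 = forward(x, W1, W2, 0.5, rng)  # first dropout sample of the same network
p2 = forward(x, W1, W2, 0.5, rng)  # second pass, different mask
sd_loss = 0.5 * (kl(p1, p2) + kl(p2, p1)).mean()  # symmetric self-distillation term
print(float(sd_loss))
```

In practice a term like `sd_loss` would be added to the ordinary cross-entropy loss and backpropagated, so the network regularizes itself against its own dropout-perturbed predictions; consult the paper itself for the actual formulation.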
Papers citing "Self-Knowledge Distillation via Dropout" (1 of 1 shown):
ABKD: Pursuing a Proper Allocation of the Probability Mass in Knowledge Distillation via α-β-Divergence
Guanghui Wang
Zhiyong Yang
Ziyi Wang
Shi Wang
Qianqian Xu
Qingming Huang
07 May 2025