Noise as a Resource for Learning in Knowledge Distillation
arXiv:1910.05057
11 October 2019
Elahe Arani, F. Sarfraz, Bahram Zonooz
Papers citing "Noise as a Resource for Learning in Knowledge Distillation" (4 of 4 papers shown)

Can Model Compression Improve NLP Fairness
Guangxuan Xu, Qingyuan Hu
21 Jan 2022

An Empirical Analysis of the Impact of Data Augmentation on Knowledge Distillation
Deepan Das, Haley Massa, Abhimanyu Kulkarni, Theodoros Rekatsinas
06 Jun 2020

Adversarial examples in the physical world
Alexey Kurakin, Ian Goodfellow, Samy Bengio
SILM, AAML
08 Jul 2016

Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
Y. Gal, Zoubin Ghahramani
UQCV, BDL
06 Jun 2015