Noise as a Resource for Learning in Knowledge Distillation

11 October 2019
Elahe Arani, F. Sarfraz, Bahram Zonooz

Papers citing "Noise as a Resource for Learning in Knowledge Distillation" (4 of 4 papers shown)

Can Model Compression Improve NLP Fairness
Guangxuan Xu, Qingyuan Hu
21 Jan 2022

An Empirical Analysis of the Impact of Data Augmentation on Knowledge Distillation
Deepan Das, Haley Massa, Abhimanyu Kulkarni, Theodoros Rekatsinas
06 Jun 2020

Adversarial examples in the physical world
Alexey Kurakin, Ian Goodfellow, Samy Bengio
SILM, AAML
08 Jul 2016

Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
Y. Gal, Zoubin Ghahramani
UQCV, BDL
06 Jun 2015