Knowledge Distillation: Bad Models Can Be Good Role Models

28 March 2022 · arXiv:2203.14649
Gal Kaplun, Eran Malach, Preetum Nakkiran, Shai Shalev-Shwartz
FedML

Papers citing "Knowledge Distillation: Bad Models Can Be Good Role Models" (9 papers)
Advancing Compressed Video Action Recognition through Progressive Knowledge Distillation
Efstathia Soufleri, Deepak Ravikumar, Kaushik Roy
02 Jul 2024

Trans-LoRA: towards data-free Transferable Parameter Efficient Finetuning
Runqian Wang, Soumya Ghosh, David D. Cox, Diego Antognini, Aude Oliva, Rogerio Feris, Leonid Karlinsky
27 May 2024

SeqNAS: Neural Architecture Search for Event Sequence Classification
Igor Udovichenko, Egor Shvetsov, Denis Divitsky, Dmitry Osin, I. Trofimov, Anatoliy Glushenko, I. Sukharev, Dmitry Berestnev, E. Burnaev
06 Jan 2024

Bayesian Optimization Meets Self-Distillation
HyunJae Lee, Heon Song, Hyeonsoo Lee, Gi-hyeon Lee, Suyeong Park, Donggeun Yoo
UQCV, BDL
25 Apr 2023

Robust Knowledge Distillation from RNN-T Models With Noisy Training Labels Using Full-Sum Loss
Mohammad Zeineldeen, Kartik Audhkhasi, M. Baskar, Bhuvana Ramabhadran
10 Mar 2023

Understanding Self-Distillation in the Presence of Label Noise
Rudrajit Das, Sujay Sanghavi
30 Jan 2023

On student-teacher deviations in distillation: does it pay to disobey?
Vaishnavh Nagarajan, A. Menon, Srinadh Bhojanapalli, H. Mobahi, Surinder Kumar
30 Jan 2023

Reduce, Reuse, Recycle: Improving Training Efficiency with Distillation
Cody Blakeney, Jessica Zosa Forde, Jonathan Frankle, Ziliang Zong, Matthew L. Leavitt
VLM
01 Nov 2022

Meta Pseudo Labels
Hieu H. Pham, Zihang Dai, Qizhe Xie, Minh-Thang Luong, Quoc V. Le
VLM
23 Mar 2020