Augmenting Knowledge Distillation With Peer-To-Peer Mutual Learning For Model Compression
Usma Niyaz, Deepti R. Bathula
arXiv:2110.11023, 21 October 2021

Papers citing "Augmenting Knowledge Distillation With Peer-To-Peer Mutual Learning For Model Compression" (3 papers):

PeerAiD: Improving Adversarial Distillation from a Specialized Peer Tutor
Jaewon Jung, Hongsun Jang, Jaeyong Song, Jinho Lee
11 Mar 2024 [OOD, AAML]

Leveraging Different Learning Styles for Improved Knowledge Distillation in Biomedical Imaging
Usma Niyaz, A. Sambyal, Deepti R. Bathula
06 Dec 2022

Knowledge distillation with a class-aware loss for endoscopic disease detection
P. E. Chavarrias-Solano, Mansoor Ali Teevno, Gilberto Ochoa-Ruiz, Sharib Ali
19 Jul 2022