Augmenting Knowledge Distillation With Peer-To-Peer Mutual Learning For Model Compression

arXiv:2110.11023 · 21 October 2021
Usma Niyaz, Deepti R. Bathula

Papers citing "Augmenting Knowledge Distillation With Peer-To-Peer Mutual Learning For Model Compression"

3 / 3 papers shown
PeerAiD: Improving Adversarial Distillation from a Specialized Peer Tutor
Jaewon Jung, Hongsun Jang, Jaeyong Song, Jinho Lee
Tags: OOD, AAML
11 Mar 2024
Leveraging Different Learning Styles for Improved Knowledge Distillation in Biomedical Imaging
Usma Niyaz, A. Sambyal, Deepti R. Bathula
06 Dec 2022
Knowledge distillation with a class-aware loss for endoscopic disease detection
P. E. Chavarrias-Solano, Mansoor Ali Teevno, Gilberto Ochoa-Ruiz, Sharib Ali
19 Jul 2022