Knowledge Distillation Beyond Model Compression

arXiv:2007.01922 · 3 July 2020
F. Sarfraz, Elahe Arani, Bahram Zonooz
Papers citing "Knowledge Distillation Beyond Model Compression"
5 / 5 papers shown

Rotation Invariant Quantization for Model Compression
Dor-Joseph Kampeas, Yury Nahshan, Hanoch Kremer, Gil Lederman, Shira Zaloshinski, Zheng Li, E. Haleva
MQ · 03 Mar 2023

LILA-BOTI : Leveraging Isolated Letter Accumulations By Ordering Teacher Insights for Bangla Handwriting Recognition
Md. Ismail Hossain, Mohammed Rakib, Sabbir Mollah, Fuad Rahman, Nabeel Mohammed
23 May 2022

Self-Distillation from the Last Mini-Batch for Consistency Regularization
Yiqing Shen, Liwu Xu, Yuzhe Yang, Yaqian Li, Yandong Guo
30 Mar 2022

Distill on the Go: Online knowledge distillation in self-supervised learning
Prashant Shivaram Bhat, Elahe Arani, Bahram Zonooz
SSL · 20 Apr 2021

Knowledge Distillation by On-the-Fly Native Ensemble
Xu Lan, Xiatian Zhu, S. Gong
12 Jun 2018