
Revisiting Knowledge Distillation: An Inheritance and Exploration Framework

1 July 2021 · arXiv:2107.00181
Zhen Huang, Xu Shen, Jun Xing, Tongliang Liu, Xinmei Tian, Houqiang Li, Bing Deng, Jianqiang Huang, Xiansheng Hua
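For context on the paper's topic, here is a minimal Python sketch of the standard teacher-student distillation loss (Hinton et al., 2015) that the paper revisits. This is generic background, not the inheritance-and-exploration objective the paper itself proposes; the function name, temperature T, and weight alpha below are illustrative choices.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soften both logit distributions with temperature T.
    soft_targets = F.softmax(teacher_logits / T, dim=1)
    log_student = F.log_softmax(student_logits / T, dim=1)
    # The KL term is scaled by T^2 so its gradient magnitude stays
    # comparable to the hard-label term across temperatures.
    kd = F.kl_div(log_student, soft_targets, reduction="batchmean") * (T * T)
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

# Example: a batch of 8 samples over 10 classes.
student_logits = torch.randn(8, 10)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)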

Papers citing "Revisiting Knowledge Distillation: An Inheritance and Exploration Framework"

5 / 5 papers shown
Distilling Universal and Joint Knowledge for Cross-Domain Model Compression on Time Series Data
Qing Xu, Min-man Wu, Xiaoli Li, K. Mao, Zhenghua Chen
07 Jul 2023

Multi-Modality Distillation via Learning the teacher's modality-level Gram Matrix
Peng Liu
21 Dec 2021

Unsupervised Domain Adaptive Person Re-Identification via Human Learning Imitation
Yang Peng, Ping Liu, Yawei Luo, Pan Zhou, Zichuan Xu, Jingen Liu
28 Nov 2021

Knowledge Distillation by On-the-Fly Native Ensemble
Xu Lan, Xiatian Zhu, S. Gong
12 Jun 2018

On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
15 Sep 2016