DistPro: Searching A Fast Knowledge Distillation Process via Meta Optimization

12 April 2022
XueQing Deng, Dawei Sun, Shawn D. Newsam, Peng Wang

Papers citing "DistPro: Searching A Fast Knowledge Distillation Process via Meta Optimization"

6 papers shown

Contrastive Representation Distillation via Multi-Scale Feature Decoupling
Cuipeng Wang, Tieyuan Chen, Haipeng Wang (09 Feb 2025)

Rethinking Knowledge in Distillation: An In-context Sample Retrieval Perspective
Jinjing Zhu, Songze Li, Lin Wang (13 Jan 2025)

Adaptive Teaching with Shared Classifier for Knowledge Distillation
Jaeyeon Jang, Young-Ik Kim, Jisu Lim, Hyeonseong Lee (12 Jun 2024)

NORM: Knowledge Distillation via N-to-One Representation Matching
Xiaolong Liu, Lujun Li, Chao Li, Anbang Yao (23 May 2023)

PURSUhInT: In Search of Informative Hint Points Based on Layer Clustering for Knowledge Distillation
Reyhan Kevser Keser, Aydin Ayanzadeh, O. A. Aghdam, Çaglar Kilcioglu, B. U. Toreyin, N. K. Üre (26 Feb 2021)

Contrastive Representation Distillation
Yonglong Tian, Dilip Krishnan, Phillip Isola (23 Oct 2019)