ResearchTrend.AI

AUTOKD: Automatic Knowledge Distillation Into A Student Architecture Family

5 November 2021
Roy Henha Eyono
Fabio Maria Carlucci
P. Esperança
Binxin Ru
Phillip Torr

Papers citing "AUTOKD: Automatic Knowledge Distillation Into A Student Architecture Family"

4 / 4 papers shown
Generalizing Teacher Networks for Effective Knowledge Distillation Across Student Architectures
Kuluhan Binici
Weiming Wu
Tulika Mitra
22 Jul 2024
Design Automation for Fast, Lightweight, and Effective Deep Learning Models: A Survey
Dalin Zhang
Kaixuan Chen
Yan Zhao
B. Yang
Li-Ping Yao
Christian S. Jensen
22 Aug 2022
A Novel Architecture Slimming Method for Network Pruning and Knowledge Distillation
Dongqi Wang
Shengyu Zhang
Zhipeng Di
Xin Lin
Weihua Zhou
Leilei Gan
21 Feb 2022
Contrastive Representation Distillation
Yonglong Tian
Dilip Krishnan
Phillip Isola
23 Oct 2019