arXiv:2111.03555
AUTOKD: Automatic Knowledge Distillation Into A Student Architecture Family
Roy Henha Eyono, Fabio Maria Carlucci, P. Esperança, Binxin Ru, Phillip Torr
5 November 2021
Papers citing "AUTOKD: Automatic Knowledge Distillation Into A Student Architecture Family" (4 of 4 papers shown)
1. Generalizing Teacher Networks for Effective Knowledge Distillation Across Student Architectures
   Kuluhan Binici, Weiming Wu, Tulika Mitra
   22 Jul 2024 (54 / 1 / 0)

2. Design Automation for Fast, Lightweight, and Effective Deep Learning Models: A Survey
   Dalin Zhang, Kaixuan Chen, Yan Zhao, B. Yang, Li-Ping Yao, Christian S. Jensen
   22 Aug 2022 (116 / 3 / 0)

3. A Novel Architecture Slimming Method for Network Pruning and Knowledge Distillation
   Dongqi Wang, Shengyu Zhang, Zhipeng Di, Xin Lin, Weihua Zhou, Leilei Gan
   21 Feb 2022 (61 / 0 / 0)

4. Contrastive Representation Distillation
   Yonglong Tian, Dilip Krishnan, Phillip Isola
   23 Oct 2019 (201 / 1,057 / 0)