Highlight Every Step: Knowledge Distillation via Collaborative Teaching
Haoran Zhao, Xin Sun, Junyu Dong, Changrui Chen, Zihe Dong
23 July 2019 (arXiv: 1907.09643)
Papers citing "Highlight Every Step: Knowledge Distillation via Collaborative Teaching" (8 papers shown)
PHI-S: Distribution Balancing for Label-Free Multi-Teacher Distillation
Mike Ranzinger, Jon Barker, Greg Heinrich, Pavlo Molchanov, Bryan Catanzaro, Andrew Tao
02 Oct 2024 (5 citations)

AM-RADIO: Agglomerative Vision Foundation Model -- Reduce All Domains Into One
Michael Ranzinger, Greg Heinrich, Jan Kautz, Pavlo Molchanov
10 Dec 2023 (42 citations, VLM)

Designing and Training of Lightweight Neural Networks on Edge Devices using Early Halting in Knowledge Distillation
Rahul Mishra, Hari Prabhat Gupta
30 Sep 2022 (8 citations)

A Closer Look at Branch Classifiers of Multi-exit Architectures
Shaohui Lin, Bo Ji, Rongrong Ji, Angela Yao
28 Apr 2022 (4 citations)

Single-Layer Vision Transformers for More Accurate Early Exits with Less Overhead
Arian Bakhtiarnia, Qi Zhang, Alexandros Iosifidis
19 May 2021 (35 citations)

Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
09 Jun 2020 (2,843 citations, VLM)

Preparing Lessons: Improve Knowledge Distillation with Better Supervision
Tiancheng Wen, Shenqi Lai, Xueming Qian
18 Nov 2019 (67 citations)

Neural Architecture Search with Reinforcement Learning
Barret Zoph, Quoc V. Le
05 Nov 2016 (5,329 citations)