Highlight Every Step: Knowledge Distillation via Collaborative Teaching

23 July 2019 · arXiv:1907.09643
Haoran Zhao, Xin Sun, Junyu Dong, Changrui Chen, Zihe Dong

Papers citing "Highlight Every Step: Knowledge Distillation via Collaborative Teaching"

8 papers shown

1. PHI-S: Distribution Balancing for Label-Free Multi-Teacher Distillation
   Mike Ranzinger, Jon Barker, Greg Heinrich, Pavlo Molchanov, Bryan Catanzaro, Andrew Tao
   02 Oct 2024 · 5 citations

2. AM-RADIO: Agglomerative Vision Foundation Model -- Reduce All Domains Into One
   Michael Ranzinger, Greg Heinrich, Jan Kautz, Pavlo Molchanov
   10 Dec 2023 · 42 citations · VLM

3. Designing and Training of Lightweight Neural Networks on Edge Devices using Early Halting in Knowledge Distillation
   Rahul Mishra, Hari Prabhat Gupta
   30 Sep 2022 · 8 citations

4. A Closer Look at Branch Classifiers of Multi-exit Architectures
   Shaohui Lin, Bo Ji, Rongrong Ji, Angela Yao
   28 Apr 2022 · 4 citations

5. Single-Layer Vision Transformers for More Accurate Early Exits with Less Overhead
   Arian Bakhtiarnia, Qi Zhang, Alexandros Iosifidis
   19 May 2021 · 35 citations

6. Knowledge Distillation: A Survey
   Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
   09 Jun 2020 · 2,843 citations · VLM

7. Preparing Lessons: Improve Knowledge Distillation with Better Supervision
   Tiancheng Wen, Shenqi Lai, Xueming Qian
   18 Nov 2019 · 67 citations

8. Neural Architecture Search with Reinforcement Learning
   Barret Zoph, Quoc V. Le
   05 Nov 2016 · 5,329 citations