Ensemble Knowledge Distillation for Learning Improved and Efficient Networks
Umar Asif, Jianbin Tang, S. Harrer
arXiv:1909.08097 · 17 September 2019 · FedML
Papers citing "Ensemble Knowledge Distillation for Learning Improved and Efficient Networks" (16 of 16 papers shown)
Corrected with the Latest Version: Make Robust Asynchronous Federated Learning Possible
Chaoyi Lu, Yiding Sun, Pengbo Li, Zhichuan Yang · FedML · 05 Apr 2025
PHI-S: Distribution Balancing for Label-Free Multi-Teacher Distillation
Mike Ranzinger, Jon Barker, Greg Heinrich, Pavlo Molchanov, Bryan Catanzaro, Andrew Tao · 02 Oct 2024
AM-RADIO: Agglomerative Vision Foundation Model -- Reduce All Domains Into One
Michael Ranzinger, Greg Heinrich, Jan Kautz, Pavlo Molchanov · VLM · 10 Dec 2023
Multimodal Distillation for Egocentric Action Recognition
Gorjan Radevski, Dusan Grujicic, Marie-Francine Moens, Matthew Blaschko, Tinne Tuytelaars · EgoV · 14 Jul 2023
EnSiam: Self-Supervised Learning With Ensemble Representations
Kai Han, Minsik Lee · SSL · 22 May 2023
FSNet: Redesign Self-Supervised MonoDepth for Full-Scale Depth Prediction for Autonomous Driving
Yuxuan Liu, Zhenhua Xu, Huaiyang Huang, Lujia Wang, Ming-Yu Liu · MDE · 21 Apr 2023
Knowledge Distillation for Efficient Sequences of Training Runs
Xingyu Liu, A. Leonardi, Lu Yu, Chris Gilmer-Hill, Matthew L. Leavitt, Jonathan Frankle · 11 Mar 2023
End-to-end Ensemble-based Feature Selection for Paralinguistics Tasks
Tamás Grósz, Mittul Singh, Sudarsana Reddy Kadiri, H. Kathania, M. Kurimo · 28 Oct 2022
Federated Learning with Privacy-Preserving Ensemble Attention Distillation
Xuan Gong, Liangchen Song, Rishi Vedula, Abhishek Sharma, Meng Zheng, ..., Arun Innanje, Terrence Chen, Junsong Yuan, David Doermann, Ziyan Wu · FedML · 16 Oct 2022
Label driven Knowledge Distillation for Federated Learning with non-IID Data
Minh-Duong Nguyen, Viet Quoc Pham, D. Hoang, Long Tran-Thanh, Diep N. Nguyen, W. Hwang · 29 Sep 2022
Preserving Privacy in Federated Learning with Ensemble Cross-Domain Knowledge Distillation
Xuan Gong, Abhishek Sharma, Srikrishna Karanam, Ziyan Wu, Terrence Chen, David Doermann, Arun Innanje · FedML · 10 Sep 2022
Enhancing Heterogeneous Federated Learning with Knowledge Extraction and Multi-Model Fusion
Duy Phuong Nguyen, Sixing Yu, J. P. Muñoz, Ali Jannesari · FedML · 16 Aug 2022
ParaDiS: Parallelly Distributable Slimmable Neural Networks
A. Ozerov, Anne Lambert, S. Kumaraswamy · UQCV, MoE · 06 Oct 2021
Students are the Best Teacher: Exit-Ensemble Distillation with Multi-Exits
Hojung Lee, Jong-Seok Lee · 01 Apr 2021
A Comprehensive Survey on Hardware-Aware Neural Architecture Search
Hadjer Benmeziane, Kaoutar El Maghraoui, Hamza Ouarnoughi, Smail Niar, Martin Wistuba, Naigang Wang · 22 Jan 2021
Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao · VLM · 09 Jun 2020