Large scale distributed neural network training through online distillation
arXiv:1804.03235 · 9 April 2018
Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton
Community: FedML
Papers citing "Large scale distributed neural network training through online distillation" (24 of 74 papers shown):
Dual-Teacher++: Exploiting Intra-domain and Inter-domain Knowledge with Reliable Transfer for Cardiac Segmentation
Kang Li, Shujun Wang, Lequan Yu, Pheng-Ann Heng
07 Jan 2021 · 62 / 28 / 0

Cross-Layer Distillation with Semantic Calibration
Defang Chen, Jian-Ping Mei, Yuan Zhang, Can Wang, Yan Feng, Chun-Yen Chen
FedML · 06 Dec 2020 · 39 / 286 / 0

Heterogeneous Data-Aware Federated Learning
Lixuan Yang, Cedric Beliard, Dario Rossi
FedML · 12 Nov 2020 · 31 / 17 / 0

Federated Knowledge Distillation
Hyowoon Seo, Jihong Park, Seungeun Oh, M. Bennis, Seong-Lyun Kim
FedML · 04 Nov 2020 · 25 / 90 / 0

Anti-Distillation: Improving reproducibility of deep networks
G. Shamir, Lorenzo Coviello
19 Oct 2020 · 39 / 20 / 0

Communication-Efficient and Distributed Learning Over Wireless Networks: Principles and Applications
Jihong Park, S. Samarakoon, Anis Elgabli, Joongheon Kim, M. Bennis, Seong-Lyun Kim, Mérouane Debbah
06 Aug 2020 · 28 / 161 / 0

Dual-Teacher: Integrating Intra-domain and Inter-domain Teachers for Annotation-efficient Cardiac Segmentation
Kang Li, Shujun Wang, Lequan Yu, Pheng-Ann Heng
13 Jul 2020 · 17 / 50 / 0

Multiple Expert Brainstorming for Domain Adaptive Person Re-identification
Yunpeng Zhai, QiXiang Ye, Shijian Lu, Mengxi Jia, Rongrong Ji, Yonghong Tian
03 Jul 2020 · 13 / 163 / 0

Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
VLM · 09 Jun 2020 · 19 / 2,837 / 0

Understanding and Improving Knowledge Distillation
Jiaxi Tang, Rakesh Shivanna, Zhe Zhao, Dong Lin, Anima Singh, Ed H. Chi, Sagar Jain
10 Feb 2020 · 19 / 129 / 0

Cooperative Learning via Federated Distillation over Fading Channels
Jinhyun Ahn, Osvaldo Simeone, Joonhyuk Kang
FedML · 03 Feb 2020 · 14 / 29 / 0

Modeling Teacher-Student Techniques in Deep Neural Networks for Knowledge Distillation
Sajjad Abbasi, M. Hajabdollahi, N. Karimi, S. Samavi
31 Dec 2019 · 10 / 28 / 0

Knowledge Transfer Graph for Deep Collaborative Learning
Soma Minami, Tsubasa Hirakawa, Takayoshi Yamashita, H. Fujiyoshi
10 Sep 2019 · 18 / 9 / 0

Efficient Evaluation-Time Uncertainty Estimation by Improved Distillation
Erik Englesson, Hossein Azizpour
UQCV · 12 Jun 2019 · 14 / 7 / 0

Feature Fusion for Online Mutual Knowledge Distillation
Jangho Kim, Minsung Hyun, Inseop Chung, Nojun Kwak
FedML · 19 Apr 2019 · 24 / 91 / 0

End-to-End Speech Translation with Knowledge Distillation
Yuchen Liu, Hao Xiong, Zhongjun He, Jiajun Zhang, Hua-Hong Wu, Haifeng Wang, Chengqing Zong
17 Apr 2019 · 19 / 151 / 0

Correlation Congruence for Knowledge Distillation
Baoyun Peng, Xiao Jin, Jiaheng Liu, Shunfeng Zhou, Yichao Wu, Yu Liu, Dongsheng Li, Zhaoning Zhang
03 Apr 2019 · 37 / 507 / 0

Multilingual Neural Machine Translation with Knowledge Distillation
Xu Tan, Yi Ren, Di He, Tao Qin, Zhou Zhao, Tie-Yan Liu
27 Feb 2019 · 16 / 248 / 0

Accelerating Large Scale Knowledge Distillation via Dynamic Importance Sampling
Minghan Li, Tanli Zuo, Ruicheng Li, Martha White, Weishi Zheng
03 Dec 2018 · 24 / 3 / 0

Communication-Efficient On-Device Machine Learning: Federated Distillation and Augmentation under Non-IID Private Data
Eunjeong Jeong, Seungeun Oh, Hyesung Kim, Jihong Park, M. Bennis, Seong-Lyun Kim
FedML · 28 Nov 2018 · 17 / 589 / 0

Ranking Distillation: Learning Compact Ranking Models With High Performance for Recommender System
Jiaxi Tang, Ke Wang
19 Sep 2018 · 19 / 182 / 0

Collaborative Learning for Deep Neural Networks
Guocong Song, Wei Chai
FedML · 30 May 2018 · 13 / 192 / 0

Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation
Yonghui Wu, M. Schuster, Z. Chen, Quoc V. Le, Mohammad Norouzi, ..., Alex Rudnick, Oriol Vinyals, G. Corrado, Macduff Hughes, J. Dean
AIMat · 26 Sep 2016 · 716 / 6,743 / 0

On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
ODL · 15 Sep 2016 · 281 / 2,889 / 0