Online Knowledge Distillation with Diverse Peers
arXiv: 1912.00350
1 December 2019
Defang Chen, Jian-Ping Mei, Can Wang, Yan Feng, Chun-Yen Chen
Community: FedML
Papers citing "Online Knowledge Distillation with Diverse Peers" (1 of 51 papers shown)
On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
ODL · 310 · 2,896 · 0 · 15 Sep 2016