arXiv: 2312.17029
FedSDD: Scalable and Diversity-enhanced Distillation for Model Aggregation in Federated Learning
28 December 2023
Ho Man Kwan, Shenghui Song
FedML
Papers citing "FedSDD: Scalable and Diversity-enhanced Distillation for Model Aggregation in Federated Learning" (2 papers)
A Field Guide to Federated Optimization
Jianyu Wang, Zachary B. Charles, Zheng Xu, Gauri Joshi, H. B. McMahan, ..., Mi Zhang, Tong Zhang, Chunxiang Zheng, Chen Zhu, Wennan Zhu
FedML
14 Jul 2021
Large scale distributed neural network training through online distillation
Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton
FedML
09 Apr 2018