
FedSDD: Scalable and Diversity-enhanced Distillation for Model Aggregation in Federated Learning

28 December 2023
Ho Man Kwan, Shenghui Song
FedML

Papers citing "FedSDD: Scalable and Diversity-enhanced Distillation for Model Aggregation in Federated Learning"

2 papers shown
A Field Guide to Federated Optimization
Jianyu Wang, Zachary B. Charles, Zheng Xu, Gauri Joshi, H. B. McMahan, ..., Mi Zhang, Tong Zhang, Chunxiang Zheng, Chen Zhu, Wennan Zhu
FedML · 14 Jul 2021
Large scale distributed neural network training through online distillation
Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton
FedML · 09 Apr 2018