Exploring the Distributed Knowledge Congruence in Proxy-data-free Federated Distillation

arXiv: 2204.07028 · 14 April 2022
Authors: Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Quyang Pan, Junbo Zhang, Zeju Li, Qing Liu
Community: FedML

Papers citing "Exploring the Distributed Knowledge Congruence in Proxy-data-free Federated Distillation"

6 papers shown

1. Beyond Model Scale Limits: End-Edge-Cloud Federated Learning with Self-Rectified Knowledge Agglomeration
   Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Ke Xu, Quyang Pan, Bo Gao, Tian Wen
   FedML · 03 Jan 2025

2. Knowledge Distillation in Federated Edge Learning: A Survey
   Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Xue Jiang, Runhan Li, Bo Gao
   FedML · 14 Jan 2023

3. FedICT: Federated Multi-task Distillation for Multi-access Edge Computing
   Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Quyang Pan, Xue Jiang, Bo Gao
   01 Jan 2023

4. FedML: A Research Library and Benchmark for Federated Machine Learning
   Chaoyang He, Songze Li, Jinhyun So, Xiao Zeng, Mi Zhang, ..., Yang Liu, Ramesh Raskar, Qiang Yang, M. Annavaram, Salman Avestimehr
   FedML · 27 Jul 2020

5. The Future of Digital Health with Federated Learning
   Nicola Rieke, Jonny Hancox, Wenqi Li, Fausto Milletari, H. Roth, ..., Ronald M. Summers, Andrew Trask, Daguang Xu, Maximilian Baust, M. Jorge Cardoso
   OOD · 18 Mar 2020

6. Large scale distributed neural network training through online distillation
   Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton
   FedML · 09 Apr 2018