Decentralized and Model-Free Federated Learning: Consensus-Based Distillation in Function Space
Akihito Taya, Takayuki Nishio, M. Morikura, Koji Yamamoto
FedML · arXiv:2104.00352 · 1 April 2021
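
As the title indicates, the paper's core idea is to run decentralized consensus over model outputs (soft labels on shared data) rather than over parameters, so nodes with heterogeneous models can learn from neighbors without exchanging weights. Below is a minimal, hypothetical sketch of what one such consensus-distillation round could look like; the ring topology, mixing matrix, step size, and the closed-form output update are illustrative assumptions, not the authors' exact algorithm.

```python
# Hypothetical sketch of consensus-based distillation in function space.
# Nodes never exchange parameters; they exchange predictions ("soft labels")
# on a shared unlabeled dataset and nudge their own outputs toward the
# neighborhood consensus. All constants below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

n_nodes, n_shared, n_classes = 4, 32, 10

# Assumed ring topology: each node mixes with its two neighbors
# via a doubly stochastic mixing matrix W.
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    W[i, i] = 0.5
    W[i, (i - 1) % n_nodes] = 0.25
    W[i, (i + 1) % n_nodes] = 0.25

# Each node's current softmax predictions on the shared dataset.
logits = rng.normal(size=(n_nodes, n_shared, n_classes))
preds = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)

for t in range(50):
    # Consensus step in function space: average neighbors' soft labels.
    targets = np.einsum("ij,jsc->isc", W, preds)
    # Distillation step (stand-in): move each node's outputs toward its
    # consensus target; a convex combination keeps valid distributions.
    preds += 0.5 * (targets - preds)

# The nodes' output functions agree on the shared data after mixing.
spread = np.abs(preds - preds.mean(axis=0)).max()
print(f"max disagreement across nodes: {spread:.4f}")
```

In the actual method, the distillation step would be a local gradient update of each node's own model against the consensus targets (e.g. minimizing a KL divergence); the closed-form output update here is only to keep the sketch self-contained.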

Papers citing "Decentralized and Model-Free Federated Learning: Consensus-Based Distillation in Function Space"

7 of 7 citing papers shown:

Soft-Label Caching and Sharpening for Communication-Efficient Federated Distillation
Kitsuya Azuma, Takayuki Nishio, Yuichi Kitagawa, Wakako Nakano, Takahito Tanimura
FedML · 28 Apr 2025

Federated Learning with Label-Masking Distillation
Jianghu Lu, Shikun Li, Kexin Bao, Pengju Wang, Zhenxing Qian, Shiming Ge
FedML · 20 Sep 2024

Multiple Access in the Era of Distributed Computing and Edge Intelligence
Nikos G. Evgenidis, Nikos A. Mitsiou, Vasiliki I. Koutsioumpa, Sotiris A. Tegos, P. Diamantoulakis, G. Karagiannidis
26 Feb 2024

Faster Convergence with Less Communication: Broadcast-Based Subgraph Sampling for Decentralized Learning over Wireless Networks
Daniel Pérez Herrera, Zheng Chen, Erik G. Larsson
24 Jan 2024

Knowledge Distillation in Federated Edge Learning: A Survey
Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Xue Jiang, Runhan Li, Bo Gao
FedML · 14 Jan 2023

The Effect of Training Parameters and Mechanisms on Decentralized Federated Learning based on MNIST Dataset
Zhuofan Zhang, Mi Zhou, K. Niu, C. Abdallah
FedML · 07 Aug 2021

Large scale distributed neural network training through online distillation
Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton
FedML · 09 Apr 2018