Decentralized and Model-Free Federated Learning: Consensus-Based Distillation in Function Space
Akihito Taya, Takayuki Nishio, M. Morikura, Koji Yamamoto
arXiv:2104.00352 · 1 April 2021 · FedML
Papers citing "Decentralized and Model-Free Federated Learning: Consensus-Based Distillation in Function Space" (7 of 7 papers shown)

1. Soft-Label Caching and Sharpening for Communication-Efficient Federated Distillation
   Kitsuya Azuma, Takayuki Nishio, Yuichi Kitagawa, Wakako Nakano, Takahito Tanimura
   FedML · 28 Apr 2025 · 72 · 0 · 0

2. Federated Learning with Label-Masking Distillation
   Jianghu Lu, Shikun Li, Kexin Bao, Pengju Wang, Zhenxing Qian, Shiming Ge
   FedML · 20 Sep 2024 · 47 · 10 · 0

3. Multiple Access in the Era of Distributed Computing and Edge Intelligence
   Nikos G. Evgenidis, Nikos A. Mitsiou, Vasiliki I. Koutsioumpa, Sotiris A. Tegos, P. Diamantoulakis, G. Karagiannidis
   26 Feb 2024 · 41 · 8 · 0

4. Faster Convergence with Less Communication: Broadcast-Based Subgraph Sampling for Decentralized Learning over Wireless Networks
   Daniel Pérez Herrera, Zheng Chen, Erik G. Larsson
   24 Jan 2024 · 26 · 1 · 0

5. Knowledge Distillation in Federated Edge Learning: A Survey
   Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Xue Jiang, Runhan Li, Bo Gao
   FedML · 14 Jan 2023 · 35 · 4 · 0

6. The Effect of Training Parameters and Mechanisms on Decentralized Federated Learning based on MNIST Dataset
   Zhuofan Zhang, Mi Zhou, K. Niu, C. Abdallah
   FedML · 07 Aug 2021 · 44 · 1 · 0

7. Large scale distributed neural network training through online distillation
   Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton
   FedML · 09 Apr 2018 · 278 · 404 · 0