FedMoE: Personalized Federated Learning via Heterogeneous Mixture of Experts
arXiv: 2408.11304 | 21 August 2024
Hanzi Mei, Dongqi Cai, Ao Zhou, Shangguang Wang, Mengwei Xu
Tags: MoE
Papers citing "FedMoE: Personalized Federated Learning via Heterogeneous Mixture of Experts" (5 of 5 shown)

1. FedADP: Unified Model Aggregation for Federated Learning with Heterogeneous Model Architectures
   Jiacheng Wang, Hongtao Lv, Lei Liu
   FedML | 25 | 0 | 0 | 10 May 2025

2. MoQa: Rethinking MoE Quantization with Multi-stage Data-model Distribution Awareness
   Zihao Zheng, Xiuping Cui, Size Zheng, Maoliang Li, Jiayu Chen, Yun Liang, Xiang Chen
   MQ, MoE | 69 | 0 | 0 | 27 Mar 2025

3. Personalized Federated Fine-Tuning for LLMs via Data-Driven Heterogeneous Model Architectures
   Yicheng Zhang, Zhen Qin, Zhaomin Wu, Jian Hou, Shuiguang Deng
   80 | 2 | 0 | 28 Nov 2024

4. Scalable and Efficient MoE Training for Multitask Multilingual Models
   Young Jin Kim, A. A. Awan, Alexandre Muzio, Andres Felipe Cruz Salinas, Liyang Lu, Amr Hendy, Samyam Rajbhandari, Yuxiong He, Hany Awadalla
   MoE | 104 | 84 | 0 | 22 Sep 2021

5. Scaling Laws for Neural Language Models
   Jared Kaplan, Sam McCandlish, T. Henighan, Tom B. Brown, B. Chess, R. Child, Scott Gray, Alec Radford, Jeff Wu, Dario Amodei
   264 | 4,505 | 0 | 23 Jan 2020