DA-MoE: Towards Dynamic Expert Allocation for Mixture-of-Experts Models
arXiv: 2409.06669 · 10 September 2024
Maryam Akhavan Aghdam, Hongpeng Jin, Yanzhao Wu
Tags: MoE
Papers citing "DA-MoE: Towards Dynamic Expert Allocation for Mixture-of-Experts Models" (3 of 3 papers shown)
MoQa: Rethinking MoE Quantization with Multi-stage Data-model Distribution Awareness
Zihao Zheng, Xiuping Cui, Size Zheng, Maoliang Li, Jiayu Chen, Yun Liang, Xiang Chen
Tags: MQ, MoE · 27 Mar 2025
Mixture-of-Experts with Expert Choice Routing
Yanqi Zhou, Tao Lei, Hanxiao Liu, Nan Du, Yanping Huang, Vincent Zhao, Andrew M. Dai, Zhifeng Chen, Quoc V. Le, James Laudon
Tags: MoE · 18 Feb 2022
Scaling Laws for Neural Language Models
Jared Kaplan, Sam McCandlish, T. Henighan, Tom B. Brown, B. Chess, R. Child, Scott Gray, Alec Radford, Jeff Wu, Dario Amodei
23 Jan 2020