ResearchTrend.AI
PreMoe: Lightening MoEs on Constrained Memory by Expert Pruning and Retrieval

23 May 2025
Zehua Pei, Ying Zhang, Hui-Ling Zhen, Xianzhi Yu, Wulong Liu, Sinno Jialin Pan, Mingxuan Yuan, Bei Yu
Tags: MoE
arXiv: 2505.17639 (abs / PDF / HTML)

Papers citing "PreMoe: Lightening MoEs on Constrained Memory by Expert Pruning and Retrieval"

3 of 3 papers shown
Pangu Ultra MoE: How to Train Your Big MoE on Ascend NPUs (07 May 2025)
Yehui Tang, Yichun Yin, Yaoyuan Wang, Hang Zhou, Yu Pan, ..., Zhe Liu, Zhicheng Liu, Zhuowen Tu, Zilin Ding, Zongyuan Zhan
Tags: MoE
DeepSeek-R1: Incentivizing Reasoning Capability in LLMs via Reinforcement Learning (22 Jan 2025)
DeepSeek-AI, Daya Guo, Dejian Yang, Haowei Zhang, Junxiao Song, ..., Shiyu Wang, S. Yu, Shunfeng Zhou, Shuting Pan, S.S. Li
Tags: ReLM, VLM, OffRL, AI4TS, LRM
FuseGPT: Learnable Layers Fusion of Generative Pre-trained Transformers (21 Nov 2024)
Zehua Pei, Hui-Ling Zhen, Xianzhi Yu, Sinno Jialin Pan, Mingxuan Yuan, Bei Yu
Tags: AI4CE