Occult: Optimizing Collaborative Communication across Experts for Accelerated Parallel MoE Training and Inference

19 May 2025
Shuqing Luo, Pingzhi Li, Jie Peng, Hanrui Wang, Yang Zhao, Yu Cheng, Tianlong Chen
MoE
arXiv: 2505.13345 (abs · PDF · HTML)

Papers citing "Occult: Optimizing Collaborative Communication across Experts for Accelerated Parallel MoE Training and Inference"

1 / 1 papers shown
MiniMax-01: Scaling Foundation Models with Lightning Attention
MiniMax, Aonian Li, Bangwei Gong, Bo Yang, Bo Shen, ..., Zhan Qin, Zhenhua Fan, Zhihang Yu, Z. L. Jiang, Zijia Wu
MoE
14 Jan 2025