ResearchTrend.AI

Improving Expert Specialization in Mixture of Experts
arXiv 2302.14703
28 February 2023
Yamuna Krishnamurthy, C. Watkins, Thomas Gaertner
MoE

Papers citing "Improving Expert Specialization in Mixture of Experts"

3 / 3 papers shown

Flexible task abstractions emerge in linear networks with fast and bounded units
Kai Sandbrink, Jan P. Bauer, A. Proca, Andrew M. Saxe, Christopher Summerfield, Ali Hummos
17 Jan 2025
Mixture-of-Experts with Expert Choice Routing
Yan-Quan Zhou, Tao Lei, Han-Chu Liu, Nan Du, Yanping Huang, Vincent Zhao, Andrew M. Dai, Zhifeng Chen, Quoc V. Le, James Laudon
MoE
18 Feb 2022
Beyond Distillation: Task-level Mixture-of-Experts for Efficient Inference
Sneha Kudugunta, Yanping Huang, Ankur Bapna, M. Krikun, Dmitry Lepikhin, Minh-Thang Luong, Orhan Firat
MoE
24 Sep 2021