Breaking the gridlock in Mixture-of-Experts: Consistent and Efficient Algorithms

arXiv: 1802.07417
21 February 2018
Ashok Vardhan Makkuva, Sewoong Oh, Sreeram Kannan, Pramod Viswanath
MoE

Papers citing "Breaking the gridlock in Mixture-of-Experts: Consistent and Efficient Algorithms"

5 / 5 papers shown

  • SQ Lower Bounds for Learning Mixtures of Linear Classifiers
    Ilias Diakonikolas, D. Kane, Yuxin Sun
    18 Oct 2023

  • Uniform Consistency in Nonparametric Mixture Models
    Bryon Aragam, Ruiyi Yang
    31 Aug 2021

  • On component interactions in two-stage recommender systems
    Jiri Hron, K. Krauth, Michael I. Jordan, Niki Kilbertus
    CML, LRM
    28 Jun 2021

  • Convergence Rates for Gaussian Mixtures of Experts
    Nhat Ho, Chiao-Yu Yang, Michael I. Jordan
    09 Jul 2019

  • Effective Approaches to Attention-based Neural Machine Translation
    Thang Luong, Hieu H. Pham, Christopher D. Manning
    17 Aug 2015