Patch-level Routing in Mixture-of-Experts is Provably Sample-efficient for Convolutional Neural Networks
arXiv:2306.04073 · 7 June 2023
Mohammed Nowaz Rabbani Chowdhury, Shuai Zhang, Meng Wang, Sijia Liu, Pin-Yu Chen
MoE
Papers citing "Patch-level Routing in Mixture-of-Experts is Provably Sample-efficient for Convolutional Neural Networks" (5 of 5 papers shown):
Learning Soft Sparse Shapes for Efficient Time-Series Classification
Zhen Liu, Yicheng Luo, Yangqiu Song, Emadeldeen Eldele, Min-man Wu, Qianli Ma
AI4TS · 11 May 2025
Backdoor Attacks Against Patch-based Mixture of Experts
Cedric Chan, Jona te Lintelo, Stjepan Picek
AAML, MoE · 3 May 2025
A Comprehensive Survey of Mixture-of-Experts: Algorithms, Theory, and Applications
Siyuan Mu, Sen Lin
MoE · 10 Mar 2025
Filtered not Mixed: Stochastic Filtering-Based Online Gating for Mixture of Large Language Models
Raeid Saqur, Anastasis Kratsios, Florian Krach, Yannick Limmer, Jacob-Junqi Tian, John Willes, Blanka Horvath, Frank Rudzicz
MoE · 24 Feb 2025
LocMoE: A Low-Overhead MoE for Large Language Model Training
Jing Li, Zhijie Sun, Xuan He, Li Zeng, Yi Lin, Entong Li, Binfan Zheng, Rongqian Zhao, Xin Chen
MoE · 25 Jan 2024