Mod-Squad: Designing Mixture of Experts As Modular Multi-Task Learners
Zitian Chen, Songlin Yang, Mingyu Ding, Zhenfang Chen, Hengshuang Zhao, E. Learned-Miller, Chuang Gan
arXiv:2212.08066 · MoE · 15 December 2022
Papers citing "Mod-Squad: Designing Mixture of Experts As Modular Multi-Task Learners" (6 papers)
Towards Modular LLMs by Building and Reusing a Library of LoRAs
O. Ostapenko, Zhan Su, E. Ponti, Laurent Charlin, Nicolas Le Roux, Matheus Pereira, Lucas Caccia, Alessandro Sordoni
MoMe · 18 May 2024
SILO Language Models: Isolating Legal Risk In a Nonparametric Datastore
Sewon Min, Suchin Gururangan, Eric Wallace, Hannaneh Hajishirzi, Noah A. Smith, Luke Zettlemoyer
AILaw · 08 Aug 2023
TaskExpert: Dynamically Assembling Multi-Task Representations with Memorial Mixture-of-Experts
Hanrong Ye, Dan Xu
MoE · 28 Jul 2023
Mixture of Attention Heads: Selecting Attention Heads Per Token
Xiaofeng Zhang, Songlin Yang, Zeyu Huang, Jie Zhou, Wenge Rong, Zhang Xiong
MoE · 11 Oct 2022
Comparing Rewinding and Fine-tuning in Neural Network Pruning
Alex Renda, Jonathan Frankle, Michael Carbin
05 Mar 2020
Deep Elastic Networks with Model Selection for Multi-Task Learning
Chanho Ahn, Eunwoo Kim, Songhwai Oh
11 Sep 2019