arXiv: 2402.05526
Buffer Overflow in Mixture of Experts
8 February 2024
Jamie Hayes
Ilia Shumailov
Itay Yona
Topics: MoE
Papers citing "Buffer Overflow in Mixture of Experts" (2 of 2 shown)
BadMoE: Backdooring Mixture-of-Experts LLMs via Optimizing Routing Triggers and Infecting Dormant Experts
Qingyue Wang, Qi Pang, Xixun Lin, Shuai Wang, Daoyuan Wu
Topics: MoE
24 Apr 2025

LLMmap: Fingerprinting For Large Language Models
Dario Pasquini, Evgenios M. Kornaropoulos, G. Ateniese
22 Jul 2024