arXiv: 2411.01245
PMoL: Parameter Efficient MoE for Preference Mixing of LLM Alignment
2 November 2024
Dongxu Liu
Bing Xu
Yinzhuo Chen
Bufan Xu
Wenpeng Lu
Muyun Yang
T. Zhao
MoE
Papers citing "PMoL: Parameter Efficient MoE for Preference Mixing of LLM Alignment" (1 of 1 papers shown)
LEO-MINI: An Efficient Multimodal Large Language Model using Conditional Token Reduction and Mixture of Multi-Modal Experts
Yimu Wang
Mozhgan Nasr Azadani
Sean Sedwards
Krzysztof Czarnecki
MLLM
MoE
07 Apr 2025