MM1: Methods, Analysis & Insights from Multimodal LLM Pre-training
14 March 2024
Brandon McKinzie, Zhe Gan, J. Fauconnier, Sam Dodge, Bowen Zhang, Philipp Dufter, Dhruti Shah, Xianzhi Du, Futang Peng, Floris Weers, Anton Belyi, Haotian Zhang, Karanjeet Singh, Doug Kang, Ankur Jain, Hongyu He, Max Schwarzer, Tom Gunter, Xiang Kong, Aonan Zhang, Jianyu Wang, Chong Wang, Nan Du, Tao Lei, Sam Wiseman, Guoli Yin, Mark Lee, Zirui Wang, Ruoming Pang, Peter Grasch, Alexander Toshev, Yinfei Yang
MLLM
arXiv: 2403.09611 · PDF · HTML
Papers citing "MM1: Methods, Analysis & Insights from Multimodal LLM Pre-training" (2 of 52 papers shown)
Conceptual 12M: Pushing Web-Scale Image-Text Pre-Training To Recognize Long-Tail Visual Concepts
Soravit Changpinyo, P. Sharma, Nan Ding, Radu Soricut · VLM · 278 · 1,082 · 0 · 17 Feb 2021

Megatron-LM: Training Multi-Billion Parameter Language Models Using Model Parallelism
M. Shoeybi, M. Patwary, Raul Puri, P. LeGresley, Jared Casper, Bryan Catanzaro · MoE · 245 · 1,821 · 0 · 17 Sep 2019