ResearchTrend.AI

MM1: Methods, Analysis & Insights from Multimodal LLM Pre-training

14 March 2024
Brandon McKinzie, Zhe Gan, J. Fauconnier, Sam Dodge, Bowen Zhang, Philipp Dufter, Dhruti Shah, Xianzhi Du, Futang Peng, Floris Weers, Anton Belyi, Haotian Zhang, Karanjeet Singh, Doug Kang, Ankur Jain, Hongyu He, Max Schwarzer, Tom Gunter, Xiang Kong, Aonan Zhang, Jianyu Wang, Chong-Jun Wang, Nan Du, Tao Lei, Sam Wiseman, Guoli Yin, Mark Lee, Zirui Wang, Ruoming Pang, Peter Grasch, Alexander Toshev, Yinfei Yang
    MLLM

Papers citing "MM1: Methods, Analysis & Insights from Multimodal LLM Pre-training"

2 / 52 papers shown
Conceptual 12M: Pushing Web-Scale Image-Text Pre-Training To Recognize Long-Tail Visual Concepts
Soravit Changpinyo, P. Sharma, Nan Ding, Radu Soricut
VLM · 278 · 1,082 · 0 · 17 Feb 2021

Megatron-LM: Training Multi-Billion Parameter Language Models Using Model Parallelism
M. Shoeybi, M. Patwary, Raul Puri, P. LeGresley, Jared Casper, Bryan Catanzaro
MoE · 245 · 1,821 · 0 · 17 Sep 2019