
Innovator-VL: A Multimodal Large Language Model for Scientific Discovery

Zichen Wen
Boxue Yang
Shuang Chen
Yaojie Zhang
Yuhang Han
Junlong Ke
Cong Wang
Yicheng Fu
Jiawang Zhao
Jiangchao Yao
Xi Fang
Zhen Wang
Henxing Cai
Lin Yao
Zhifeng Gao
Yanhui Hong
Nang Yuan
Yixuan Li
Guojiang Zhao
Haoyi Tao
Nan Wang
Han Lyu
Guolin Ke
Ning Liao
Xiaoxing Wang
Kai Chen
Zhiyu Li
Feiyu Xiong
Sihan Hu
Kun Chen
Yanfeng Wang
Weinan E
Linfeng Zhang
Main: 17 pages · 5 figures · 1 table · Bibliography: 2 pages · Appendix: 38 pages
Abstract

We present Innovator-VL, a scientific multimodal large language model designed to advance understanding and reasoning across diverse scientific domains while maintaining excellent performance on general vision tasks. Contrary to the trend of relying on massive domain-specific pretraining and opaque pipelines, our work demonstrates that principled training design and transparent methodology can yield strong scientific intelligence with substantially reduced data requirements. (i) We provide a fully transparent, end-to-end reproducible training pipeline, covering data collection, cleaning, preprocessing, supervised fine-tuning, reinforcement learning, and evaluation, along with detailed optimization recipes; this facilitates systematic extension by the community. (ii) Innovator-VL exhibits remarkable data efficiency, achieving competitive performance on a variety of scientific tasks using fewer than five million curated samples and no large-scale pretraining. These results highlight that effective reasoning can be achieved through principled data selection rather than indiscriminate scaling. (iii) Innovator-VL demonstrates strong generalization, performing competitively across general vision, multimodal reasoning, and scientific benchmarks, which indicates that scientific alignment can be integrated into a unified model without compromising general-purpose capabilities. Our practice suggests that efficient, reproducible, and high-performing scientific multimodal models can be built even without large-scale data, providing a practical foundation for future research.
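As a rough illustration of the staged pipeline the abstract describes (data collection, cleaning, preprocessing, supervised fine-tuning, reinforcement learning, and evaluation), the following minimal Python sketch shows one way such an orchestration could be structured. All names, parameters, and stage bodies here are hypothetical placeholders, not the authors' released code or API.

```python
# Hypothetical sketch of a staged training pipeline like the one the
# abstract enumerates; every identifier below is illustrative only.

from dataclasses import dataclass, field


@dataclass
class PipelineConfig:
    # Hypothetical knobs; the paper reports using fewer than five
    # million curated samples and no large-scale domain pretraining.
    max_samples: int = 5_000_000
    stages: list = field(default_factory=lambda: [
        "collect", "clean", "preprocess", "sft", "rl", "evaluate",
    ])


def run_stage(name: str, state: dict) -> dict:
    # Placeholder: a real pipeline would dispatch to data-curation,
    # training, or evaluation code for each named stage.
    print(f"running stage: {name}")
    state.setdefault("completed", []).append(name)
    return state


def run_pipeline(cfg: PipelineConfig) -> dict:
    # Run the stages strictly in order, threading shared state through.
    state: dict = {"sample_budget": cfg.max_samples}
    for stage in cfg.stages:
        state = run_stage(stage, state)
    return state


if __name__ == "__main__":
    final_state = run_pipeline(PipelineConfig())
    print("stages completed:", final_state["completed"])
```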
