NeuroGen: Neural Network Parameter Generation via Large Language Models

Abstract

Acquiring the parameters of neural networks (NNs) has been one of the most important problems in machine learning since the inception of NNs. Traditional approaches, such as backpropagation and forward-only optimization, obtain parameters by iteratively fitting training data. This paper explores the feasibility of a new direction: acquiring NN parameters via large language model (LLM) generation. We propose NeuroGen, a generalized and easy-to-implement two-stage approach for NN parameter generation conditioned on descriptions of the data, task, and network architecture. Stage one, Parameter Reference Knowledge Injection, pretrains LLMs on NN checkpoints to build a foundational understanding of parameter space; stage two, Context-Enhanced Instruction Tuning, enables LLMs to adapt to specific tasks through enriched, task-aware prompts. Experimental results demonstrate that NeuroGen effectively generates usable NN parameters. Our findings highlight the feasibility of LLM-based NN parameter generation and suggest a promising new paradigm in which LLMs and lightweight NNs coexist synergistically.
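
The paper's implementation is not reproduced on this page. As a rough illustration of the two stages described above, the following is a minimal, hypothetical Python/PyTorch sketch: the function names (flatten_checkpoint, build_prompt) and the plain-text parameter serialization are assumptions made here for illustration, not the authors' actual method.

import torch
import torch.nn as nn

def flatten_checkpoint(model: nn.Module, precision: int = 4) -> str:
    # Hypothetical serialization: turn an NN checkpoint into a text sequence
    # an LLM could be pretrained on (stage one, Parameter Reference Knowledge
    # Injection). The tag format and precision are illustrative choices.
    chunks = []
    for name, param in model.named_parameters():
        values = " ".join(f"{v:.{precision}f}" for v in param.detach().flatten().tolist())
        chunks.append(f"<layer name={name} shape={list(param.shape)}> {values}")
    return "\n".join(chunks)

def build_prompt(data_desc: str, task_desc: str, arch_desc: str) -> str:
    # Hypothetical context-enriched instruction conditioning generation on
    # data, task, and architecture descriptions (stage two, Context-Enhanced
    # Instruction Tuning).
    return (
        f"Data: {data_desc}\n"
        f"Task: {task_desc}\n"
        f"Architecture: {arch_desc}\n"
        "Generate the network parameters:"
    )

# Example: a lightweight network whose checkpoint becomes LLM training text.
net = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
print(build_prompt("tabular features, 4 dimensions", "binary classification",
                   "MLP 4-8-2 with ReLU"))
print(flatten_checkpoint(net)[:200], "...")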

@article{wang2025_2505.12470,
  title={NeuroGen: Neural Network Parameter Generation via Large Language Models},
  author={Jiaqi Wang and Yusen Zhang and Xi Li},
  journal={arXiv preprint arXiv:2505.12470},
  year={2025}
}