MOLLM: Multi-Objective Large Language Model for Molecular Design -- Optimizing with Experts

18 February 2025
Nian Ran
Yue Wang
Richard Allmendinger
Abstract

Molecular design plays a critical role in advancing fields such as drug discovery, materials science, and chemical engineering. This work introduces the Multi-Objective Large Language Model for Molecular Design (MOLLM), a novel framework that combines domain-specific knowledge with the adaptability of Large Language Models (LLMs) to optimize molecular properties across multiple objectives. By leveraging in-context learning and multi-objective optimization, MOLLM achieves superior efficiency, innovation, and performance, significantly surpassing state-of-the-art (SOTA) methods. Because the initial population has a substantial impact on evolutionary algorithms, we categorize initial populations into three types (best, worst, and random) and use the same initial molecules for every method in each experiment. Our results demonstrate that MOLLM consistently outperforms SOTA models across all experiments. We also provide extensive ablation studies to evaluate the contribution of each component.
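The abstract describes the framework only at a high level: an LLM proposes candidate molecules, the candidates are scored on several properties, and multi-objective (Pareto-style) selection feeds the best survivors back into the prompt as in-context examples. The following sketch is a hypothetical illustration of such a loop, not the authors' implementation; it assumes RDKit for property scoring, uses QED and logP as placeholder objectives, and leaves the LLM proposal step as a caller-supplied function.

# Hypothetical sketch of an LLM-in-the-loop multi-objective molecular optimizer.
# The objectives (QED, logP target) and the `propose` callable are illustrative
# placeholders, not MOLLM's actual prompts or property set.
from rdkit import Chem
from rdkit.Chem import Descriptors, QED

def objectives(smiles):
    """Score a molecule on two example objectives: drug-likeness (QED, maximize)
    and closeness of logP to 2.5 (negated distance, maximize)."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:  # the model may propose invalid SMILES
        return None
    return (QED.qed(mol), -abs(Descriptors.MolLogP(mol) - 2.5))

def dominates(a, b):
    """Pareto dominance: a is no worse than b everywhere and strictly better somewhere."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(scored):
    """Keep (smiles, scores) pairs whose score vectors are not dominated by any other."""
    return [(s, f) for s, f in scored
            if not any(dominates(g, f) for _, g in scored if g != f)]

def optimize(initial_smiles, propose, generations=10):
    """propose(front) -> list of SMILES; in MOLLM this role is played by the LLM,
    prompted with the current Pareto front as in-context examples of good molecules."""
    front = pareto_front([(s, f) for s in initial_smiles
                          if (f := objectives(s)) is not None])
    for _ in range(generations):
        candidates = propose(front)
        scored = [(s, f) for s in candidates if (f := objectives(s)) is not None]
        front = pareto_front(front + scored)
    return front

# Minimal usage with a no-op proposer standing in for the LLM call:
seeds = ["CCO", "c1ccccc1O", "CC(=O)Oc1ccccc1C(=O)O"]
print(optimize(seeds, propose=lambda front: [], generations=1))

Keeping selection dominance-based, rather than collapsing the properties into one weighted score, is what makes such a loop genuinely multi-objective; how MOLLM actually constructs its prompts, objectives, and initial populations is detailed in the paper itself.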

@article{ran2025_2502.12845,
  title={MOLLM: Multi-Objective Large Language Model for Molecular Design -- Optimizing with Experts},
  author={Nian Ran and Yue Wang and Richard Allmendinger},
  journal={arXiv preprint arXiv:2502.12845},
  year={2025}
}