

MoTCoder: Elevating Large Language Models with Modular of Thought for Challenging Programming Tasks

26 December 2023
Jingyao Li
Pengguang Chen
Bin Xia
Hong Xu
Jiaya Jia
Abstract

Large Language Models (LLMs) have showcased impressive capabilities in handling straightforward programming tasks. However, their performance tends to falter when confronted with more challenging programming problems. We observe that conventional models often generate solutions as monolithic code blocks, restricting their effectiveness in tackling intricate questions. To overcome this limitation, we present Modular-of-Thought Coder (MoTCoder). We introduce a pioneering framework for MoT instruction tuning, designed to promote the decomposition of tasks into logical sub-tasks and sub-modules. Our investigations reveal that, through the cultivation and utilization of sub-modules, MoTCoder significantly improves both the modularity and correctness of the generated solutions, leading to substantial relative pass@1 improvements of 12.9% on APPS and 9.43% on CodeContests. Our code is available at this https URL.
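
The abstract describes MoT instruction tuning only at a high level. As a rough illustration of the kind of output it encourages, the sketch below contrasts a modular-of-thought style solution with a monolithic one for a toy problem; the task, function names, and decomposition are hypothetical and are not taken from the paper or its training data.

# Hypothetical illustration of a modular-of-thought style solution.
# Toy task: given a list of integers, return the sum of the squares of the even values.
# A monolithic answer would inline all of this in one block; the modular style
# first names the logical sub-modules, then composes them in a top-level solver.

def filter_even(values):
    """Sub-module: keep only the even integers."""
    return [v for v in values if v % 2 == 0]

def square_all(values):
    """Sub-module: square every value."""
    return [v * v for v in values]

def solve(values):
    """Top-level solution composed from the sub-modules above."""
    return sum(square_all(filter_even(values)))

if __name__ == "__main__":
    print(solve([1, 2, 3, 4]))  # 2^2 + 4^2 = 20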

@article{li2025_2312.15960,
  title={MoTCoder: Elevating Large Language Models with Modular of Thought for Challenging Programming Tasks},
  author={Jingyao Li and Pengguang Chen and Bin Xia and Hong Xu and Jiaya Jia},
  journal={arXiv preprint arXiv:2312.15960},
  year={2025}
}