GraphThought: Graph Combinatorial Optimization with Thought Generation

Graph combinatorial optimization (GCO) problems are central to domains such as logistics and bioinformatics. While traditional solvers dominate these tasks, large language models (LLMs) offer new possibilities for structured reasoning, yet they struggle with complex GCO tasks that require rigorous combinatorial analysis and multi-step deduction, often producing hallucinated steps. We first formalize the Optimal Thoughts Design (OTD) problem, which provides structured guidance for producing high-quality intermediate reasoning steps. Building on this formulation, we introduce GraphThought, a novel framework that generates effective reasoning sequences through either heuristic-guided forward search or solver-aligned backward reasoning. By fine-tuning LLMs on these structured thought sequences, we develop Llama-GT, an 8B-parameter model that achieves state-of-the-art performance on the GraphArena benchmark, outperforming significantly larger models such as DeepSeek-V3. Our results demonstrate that principled thought generation, scaffolded with structured reasoning priors, can significantly enhance LLM performance on GCO tasks without requiring increased model scale.
@article{huang2025_2502.11607,
  title={GraphThought: Graph Combinatorial Optimization with Thought Generation},
  author={Zixiao Huang and Lifeng Guo and Wenhao Li and Junjie Sheng and Chuyun Shen and Haosheng Chen and Bo Jin and Changhong Lu and Xiangfeng Wang},
  journal={arXiv preprint arXiv:2502.11607},
  year={2025}
}