ResearchTrend.AI
CoSIL: Software Issue Localization via LLM-Driven Code Repository Graph Searching

28 March 2025
Zhonghao Jiang
Xiaoxue Ren
Meng Yan
Wei Jiang
Yongbin Li
Zhongxin Liu
Abstract

Large language models (LLMs) have significantly advanced autonomous software engineering, leading to a growing number of software engineering agents that assist developers in automatic program repair. Issue localization forms the basis for accurate patch generation. However, because of the limited context window of LLMs, existing issue localization methods struggle to balance concise yet effective contexts against adequately comprehensive search spaces. In this paper, we introduce CoSIL, an LLM-driven, simple yet powerful function-level issue localization method that requires no training or indexing. CoSIL reduces the search space through module call graphs, iteratively searches the function call graph to obtain relevant contexts, and uses context pruning to control the search direction and manage contexts effectively. Importantly, the call graph is constructed dynamically by the LLM during the search, eliminating the need for pre-parsing. Experimental results demonstrate that CoSIL achieves a Top-1 localization success rate of 43% and 44.6% on SWE-bench Lite and SWE-bench Verified, respectively, using Qwen2.5-Coder-32B, outperforming existing methods by 8.6 to 98.2 percent. When CoSIL is applied to guide the patch generation stage, the resolved rate further improves by 9.3 to 31.5 percent.
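The core idea of the abstract — an iterative search over a call graph that is expanded lazily and pruned by relevance — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `expand` and `score` functions here are hypothetical stand-ins for the LLM calls that CoSIL would make (listing callees of a function and judging its relevance to the issue text), and the mock data, function names, and threshold are invented for the example.

```python
from collections import deque

# Hypothetical call-graph data standing in for what an LLM would
# produce on the fly; all names and scores below are illustrative.
MOCK_CALLEES = {
    "app.handle_request": ["auth.check_token", "db.query"],
    "auth.check_token": ["auth.decode_jwt"],
    "db.query": ["db.connect"],
    "auth.decode_jwt": [],
    "db.connect": [],
}

MOCK_RELEVANCE = {
    "app.handle_request": 0.9,
    "auth.check_token": 0.8,
    "auth.decode_jwt": 0.85,
    "db.query": 0.2,
    "db.connect": 0.1,
}


def expand(fn: str) -> list[str]:
    """Stand-in for an LLM call that lists the callees of `fn` on demand."""
    return MOCK_CALLEES.get(fn, [])


def score(fn: str) -> float:
    """Stand-in for an LLM judging how relevant `fn` is to the issue text."""
    return MOCK_RELEVANCE.get(fn, 0.0)


def search(entry_points: list[str], threshold: float = 0.5, max_depth: int = 3) -> list[str]:
    """Iteratively walk the call graph, pruning low-relevance branches.

    The graph is never built up front: callees are requested only when a
    function survives the relevance check, mirroring the idea of dynamic
    call-graph construction without pre-parsing the repository.
    """
    queue = deque((fn, 0) for fn in entry_points)
    visited: set[str] = set()
    candidates: list[str] = []
    while queue:
        fn, depth = queue.popleft()
        if fn in visited or depth > max_depth:
            continue
        visited.add(fn)
        if score(fn) >= threshold:  # context pruning: drop irrelevant branches
            candidates.append(fn)
            for callee in expand(fn):  # expand the graph lazily
                queue.append((callee, depth + 1))
    # Rank surviving functions as localization candidates.
    return sorted(candidates, key=score, reverse=True)


print(search(["app.handle_request"]))
```

With the mock data above, `db.query` falls below the relevance threshold, so its whole subtree (`db.connect`) is never expanded, while the authentication branch is followed to `auth.decode_jwt`. The same skeleton would apply with real LLM calls substituted for `expand` and `score`.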

@article{jiang2025_2503.22424,
  title={CoSIL: Software Issue Localization via LLM-Driven Code Repository Graph Searching},
  author={Zhonghao Jiang and Xiaoxue Ren and Meng Yan and Wei Jiang and Yong Li and Zhongxin Liu},
  journal={arXiv preprint arXiv:2503.22424},
  year={2025}
}