Chain-of-Thought Tokens are Computer Program Variables

8 May 2025
Fangwei Zhu
Peiyi Wang
Zhifang Sui
Abstract

Chain-of-thought (CoT) prompting requires large language models (LLMs) to generate intermediate steps before reaching the final answer, and has proven effective in helping LLMs solve complex reasoning tasks. However, the inner mechanism of CoT remains largely unclear. In this paper, we empirically study the role of CoT tokens in LLMs on two compositional tasks: multi-digit multiplication and dynamic programming. While CoT is essential for solving these problems, we find that preserving only the tokens that store intermediate results achieves comparable performance. Furthermore, we observe that storing intermediate results in an alternative latent form does not affect model performance. We also randomly intervene on some values in the CoT and observe that subsequent CoT tokens and the final answer change correspondingly. These findings suggest that CoT tokens may function like variables in computer programs, albeit with potential drawbacks such as unintended shortcuts and limits on the computational complexity between tokens. The code and data are available at this https URL.
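The variable analogy from the abstract can be made concrete with a toy sketch (this is an illustration of the idea, not the authors' code or experimental setup): in multi-digit multiplication, each intermediate CoT token stores a partial product, and overwriting one of those stored values propagates deterministically to the final answer, just as intervening on a program variable would.

```python
def multiply_with_cot(a, b, intervene=None):
    """Multiply a * b via partial products, recording intermediate
    'CoT tokens'. `intervene` is an optional (index, value) pair that
    overwrites one stored partial product, mimicking a CoT intervention.
    """
    digits = [int(d) for d in str(b)][::-1]  # least-significant digit first
    # Each token stores one intermediate result, like a program variable.
    tokens = [a * d * 10**i for i, d in enumerate(digits)]
    if intervene is not None:
        idx, value = intervene
        tokens[idx] = value  # intervene on one variable mid-computation
    # Downstream computation reads the stored tokens, so the change propagates.
    answer = sum(tokens)
    return tokens, answer

tokens, ans = multiply_with_cot(123, 45)
# tokens = [615, 4920]; ans = 5535 = 123 * 45

_, intervened = multiply_with_cot(123, 45, intervene=(0, 700))
# overwriting the first partial product changes the answer: 700 + 4920 = 5620
```

The intervention here corresponds to the paper's observation that changing a value stored in a CoT token causes subsequent tokens and the final answer to change accordingly.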

@article{zhu2025_2505.04955,
  title={Chain-of-Thought Tokens are Computer Program Variables},
  author={Fangwei Zhu and Peiyi Wang and Zhifang Sui},
  journal={arXiv preprint arXiv:2505.04955},
  year={2025}
}