To CoT or To Loop? A Formal Comparison Between Chain-of-Thought and Looped Transformers

25 May 2025
Kevin Xu
Issei Sato
Author Contacts: kevinxu@g.ecc.u-tokyo.ac.jp, sato@g.ecc.u-tokyo.ac.jp
Main: 10 pages, 2 figures, 1 table. Bibliography: 2 pages. Appendix: 12 pages.
Abstract

Chain-of-Thought (CoT) and Looped Transformers have been shown to empirically improve performance on reasoning tasks and to theoretically enhance expressivity by recursively increasing the number of computational steps. However, their comparative capabilities are still not well understood. In this paper, we provide a formal analysis of their respective strengths and limitations. We show that Looped Transformers can efficiently simulate parallel computations for deterministic tasks, which we formalize as evaluation over directed acyclic graphs. In contrast, CoT with stochastic decoding excels at approximate inference for compositional structures, namely self-reducible problems. These separations suggest the tasks for which depth-driven recursion is more suitable, thereby offering practical cues for choosing between reasoning paradigms.

BibTeX
@article{xu2025_2505.19245,
  title={To CoT or To Loop? A Formal Comparison Between Chain-of-Thought and Looped Transformers},
  author={Kevin Xu and Issei Sato},
  journal={arXiv preprint arXiv:2505.19245},
  year={2025}
}