SKIntern: Internalizing Symbolic Knowledge for Distilling Better CoT Capabilities into Small Language Models
arXiv: 2409.13183
20 September 2024
Huanxuan Liao, Shizhu He, Yupu Hao, Xiang Li, Yuanzhe Zhang, Kang Liu, Jun Zhao
Tags: LRM
Papers citing "SKIntern: Internalizing Symbolic Knowledge for Distilling Better CoT Capabilities into Small Language Models" (5 / 5 papers shown)
Efficient Reasoning for LLMs through Speculative Chain-of-Thought
Jikai Wang, J. Li, Lijun Wu, M. Zhang
Tags: LLMAG, LRM
27 Apr 2025

CoT-RAG: Integrating Chain of Thought and Retrieval-Augmented Generation to Enhance Reasoning in Large Language Models
Feiyang Li, Peng Fang, Zhan Shi, Arijit Khan, Fang Wang, Dan Feng, Weihao Wang, Xin Zhang, Yongjian Cui
Tags: ReLM, LRM
18 Apr 2025

Efficient Reasoning Models: A Survey
Sicheng Feng, Gongfan Fang, Xinyin Ma, Xinchao Wang
Tags: ReLM, LRM
15 Apr 2025

Stop Overthinking: A Survey on Efficient Reasoning for Large Language Models
Yang Sui, Yu-Neng Chuang, Guanchu Wang, Jiamu Zhang, Tianyi Zhang, ..., Hongyi Liu, Andrew Wen, Shaochen Zhong, Hanjie Chen
Tags: OffRL, ReLM, LRM
20 Mar 2025

Neural-Symbolic Collaborative Distillation: Advancing Small Language Models for Complex Reasoning Tasks
Huanxuan Liao, Shizhu He, Yao Xu, Yuanzhe Zhang, Kang Liu, Jun Zhao
Tags: LRM
20 Sep 2024