CP-Router: An Uncertainty-Aware Router Between LLM and LRM

Recent advances in Large Reasoning Models (LRMs) have significantly improved long-chain reasoning capabilities over Large Language Models (LLMs). However, LRMs often produce unnecessarily lengthy outputs even for simple queries, leading to inefficiencies or even accuracy degradation compared to LLMs. To overcome this, we propose CP-Router, a training-free and model-agnostic routing framework that dynamically selects between an LLM and an LRM, demonstrated on multiple-choice question answering (MCQA) prompts. The routing decision is guided by prediction uncertainty estimates derived via Conformal Prediction (CP), which provides rigorous coverage guarantees. To further refine the uncertainty differentiation across inputs, we introduce Full and Binary Entropy (FBE), a novel entropy-based criterion that adaptively selects the appropriate CP threshold. Experiments across diverse MCQA benchmarks, including mathematics, logical reasoning, and Chinese chemistry, demonstrate that CP-Router efficiently reduces token usage while maintaining or even improving accuracy compared to using the LRM alone. We also extend CP-Router to diverse model pairings and open-ended QA, where it continues to demonstrate strong performance, validating its generality and robustness.
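The core routing idea described above can be sketched with standard split Conformal Prediction: calibrate a nonconformity threshold on held-out MCQA examples, build a prediction set for each new query from the LLM's label probabilities, and escalate to the LRM only when the set is not a singleton. This is a minimal illustration under assumed details (the `1 - probability` nonconformity score, the fixed miscoverage level `alpha`, and the function names are choices made here, not the paper's implementation; the FBE criterion for adaptively selecting the threshold is not reproduced):

```python
import numpy as np

def conformal_threshold(cal_scores, alpha):
    """Split-conformal quantile of calibration nonconformity scores,
    with the standard finite-sample correction (n+1)/n."""
    n = len(cal_scores)
    q = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    return np.quantile(cal_scores, q, method="higher")

def prediction_set(probs, threshold):
    """Labels whose nonconformity score (1 - predicted probability)
    falls within the calibrated threshold."""
    return [i for i, p in enumerate(probs) if 1 - p <= threshold]

def route(probs, threshold):
    """Singleton prediction set -> the LLM is confident enough;
    a larger set signals uncertainty, so escalate to the LRM."""
    return "LLM" if len(prediction_set(probs, threshold)) <= 1 else "LRM"

# Hypothetical calibration scores (1 - prob. of the true label) and queries.
cal_scores = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9])
tau = conformal_threshold(cal_scores, alpha=0.2)

confident = [0.95, 0.03, 0.01, 0.01]   # peaked over one MCQA option
uncertain = [0.40, 0.35, 0.15, 0.10]   # spread across options
print(route(confident, tau))  # routed to the cheap LLM
print(route(uncertain, tau))  # escalated to the LRM
```

In this sketch the only knob is `alpha`; the paper's FBE criterion instead chooses the CP threshold adaptively per input distribution, which is what differentiates CP-Router from a fixed-level conformal gate.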
@article{su2025_2505.19970,
  title={CP-Router: An Uncertainty-Aware Router Between LLM and LRM},
  author={Jiayuan Su and Fulin Lin and Zhaopeng Feng and Han Zheng and Teng Wang and Zhenyu Xiao and Xinlong Zhao and Zuozhu Liu and Lu Cheng and Hongwei Wang},
  journal={arXiv preprint arXiv:2505.19970},
  year={2025}
}