
Investigating Mysteries of CoT-Augmented Distillation

20 June 2024
Somin Wadhwa, Silvio Amir, Byron C. Wallace
Tags: ReLM, LRM

Papers citing "Investigating Mysteries of CoT-Augmented Distillation" (5 of 5 papers shown)

| Title | Authors | Tags | Metrics | Date |
| --- | --- | --- | --- | --- |
| Who Taught You That? Tracing Teachers in Model Distillation | Somin Wadhwa, Chantal Shaib, Silvio Amir, Byron C. Wallace | | 74 · 1 · 0 | 10 Feb 2025 |
| Boosting LLM Translation Skills without General Ability Loss via Rationale Distillation | Junhong Wu, Yang Zhao, Yangyifan Xu, Bing Liu, Chengqing Zong | CLL | 40 · 1 · 0 | 17 Oct 2024 |
| Preemptive Answer "Attacks" on Chain-of-Thought Reasoning | Rongwu Xu, Zehan Qi, Wei Xu | LRM, SILM | 64 · 6 · 0 | 31 May 2024 |
| Distilling Step-by-Step! Outperforming Larger Language Models with Less Training Data and Smaller Model Sizes | Lokesh Nagalapatti, Chun-Liang Li, Chih-Kuan Yeh, Hootan Nakhost, Yasuhisa Fujii, Alexander Ratner, Ranjay Krishna, Chen-Yu Lee, Tomas Pfister | ALM | 220 · 502 · 0 | 03 May 2023 |
| SCOTT: Self-Consistent Chain-of-Thought Distillation | Jamie Yap, Zhengyang Wang, Zheng Li, K. Lynch, Bing Yin, Xiang Ren | LRM | 64 · 93 · 0 | 03 May 2023 |