
Deeper Insights Without Updates: The Power of In-Context Learning Over Fine-Tuning

7 October 2024
Qingyu Yin, Xuzheng He, Luoao Deng, Chak Tou Leong, Fan Wang, Yanzhao Yan, Xiaoyu Shen, Qiang Zhang

Papers citing "Deeper Insights Without Updates: The Power of In-Context Learning Over Fine-Tuning"

2 / 2 papers shown

  • Induction Head Toxicity Mechanistically Explains Repetition Curse in Large Language Models
    Shuxun Wang, Qingyu Yin, Chak Tou Leong, Qiang Zhang, Linyi Yang
    17 May 2025

  • RICo: Refined In-Context Contribution for Automatic Instruction-Tuning Data Selection
    Yixin Yang, Qingxiu Dong, Linli Yao, Fangwei Zhu, Zhifang Sui
    8 May 2025