ExpeTrans: LLMs Are Experiential Transfer Learners

29 May 2025
Jinglong Gao, Xiao Ding, Lingxiao Zou, Bibo Cai, Bing Qin, Ting Liu
arXiv: 2505.23191 (abs · PDF · HTML)
Main: 9 pages · 8 figures · 11 tables · Bibliography: 3 pages · Appendix: 28 pages
Abstract

Recent studies provide large language models (LLMs) with textual task-solving experiences via prompts to improve their performance. However, previous methods rely on substantial human labor or time to gather such experiences for each task, which is impractical given the growing variety of task types in user queries to LLMs. To address this issue, we design an autonomous experience transfer framework to explore whether LLMs can mimic human cognitive intelligence and autonomously transfer experience from existing source tasks to newly encountered target tasks. This not only enables the acquisition of experience without the extensive costs of previous methods, but also offers a novel path toward the generalization of LLMs. Experimental results on 13 datasets demonstrate that our framework effectively improves the performance of LLMs. Furthermore, we provide a detailed analysis of each module in the framework.
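The abstract describes the approach only at a high level. As one way to picture the general pattern of reusing textual experience from a source task when prompting on a target task, here is a minimal Python sketch. It is not the paper's actual framework; the function name call_llm and the prompt wording are illustrative assumptions.

```python
# Minimal sketch (assumption, not the authors' implementation) of prompt-level
# experience transfer: adapt experience written for a source task, then prepend
# the adapted experience when answering a target-task query.
from typing import Callable

def transfer_experience(call_llm: Callable[[str], str],
                        source_task: str,
                        source_experience: str,
                        target_task: str) -> str:
    """Ask the model to rewrite source-task experience as target-task guidelines."""
    prompt = (
        f"You previously solved tasks of type: {source_task}.\n"
        f"Experience gathered on that task:\n{source_experience}\n\n"
        f"Rewrite this experience as guidelines for the new task type: {target_task}."
    )
    return call_llm(prompt)

def solve_with_experience(call_llm: Callable[[str], str],
                          target_task: str,
                          transferred_experience: str,
                          query: str) -> str:
    """Prepend the transferred experience to the target-task query."""
    prompt = (
        f"Task: {target_task}\n"
        f"Useful experience:\n{transferred_experience}\n\n"
        f"Question: {query}\nAnswer:"
    )
    return call_llm(prompt)
```

In this sketch, call_llm stands in for whatever chat or completion API is available; the point is only that experience is carried as text and adapted between tasks by the model itself rather than collected anew by humans.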

@article{gao2025_2505.23191,
  title={ExpeTrans: LLMs Are Experiential Transfer Learners},
  author={Jinglong Gao and Xiao Ding and Lingxiao Zou and Bibo Cai and Bing Qin and Ting Liu},
  journal={arXiv preprint arXiv:2505.23191},
  year={2025}
}