Training Task Experts through Retrieval Based Distillation

7 July 2024
Jiaxin Ge, Xueying Jia, Vijay Viswanathan, Hongyin Luo, Graham Neubig
Papers citing "Training Task Experts through Retrieval Based Distillation"

3 papers shown

CoPS: Empowering LLM Agents with Provable Cross-Task Experience Sharing
Chen Yang, Chenyang Zhao, Q. Gu, Dongruo Zhou
LRM · 22 Oct 2024

SELF-GUIDE: Better Task-Specific Instruction Following via Self-Synthetic Finetuning
Chenyang Zhao, Xueying Jia, Vijay Viswanathan, Tongshuang Wu, Graham Neubig
SyDa, ALM · 16 Jul 2024

In-Context Learning with Long-Context Models: An In-Depth Exploration
Amanda Bertsch, Maor Ivgi, Uri Alon, Jonathan Berant, Matthew R. Gormley, Graham Neubig
ReLM, AIMat · 30 Apr 2024