Exploring the Effectiveness and Consistency of Task Selection in Intermediate-Task Transfer Learning
23 July 2024
Pin-Jie Lin, Miaoran Zhang, Marius Mosbach, Dietrich Klakow
arXiv: 2407.16245

Papers citing "Exploring the Effectiveness and Consistency of Task Selection in Intermediate-Task Transfer Learning"

5 of 5 citing papers shown.
ATTEMPT: Parameter-Efficient Multi-task Tuning via Attentional Mixtures of Soft Prompts
Akari Asai, Mohammadreza Salehi, Matthew E. Peters, Hannaneh Hajishirzi
24 May 2022

SPoT: Better Frozen Model Adaptation through Soft Prompt Transfer
Tu Vu, Brian Lester, Noah Constant, Rami Al-Rfou, Daniel Cer
15 Oct 2021

P-Tuning v2: Prompt Tuning Can Be Comparable to Fine-tuning Universally Across Scales and Tasks
Xiao Liu, Kaixuan Ji, Yicheng Fu, Weng Lam Tam, Zhengxiao Du, Zhilin Yang, Jie Tang
14 Oct 2021

The Power of Scale for Parameter-Efficient Prompt Tuning
Brian Lester, Rami Al-Rfou, Noah Constant
18 Apr 2021

Which Model to Transfer? Finding the Needle in the Growing Haystack
Cédric Renggli, André Susano Pinto, Luka Rimanic, J. Puigcerver, C. Riquelme, Ce Zhang, Mario Lucic
13 Oct 2020