ResearchTrend.AI

Revisiting Prefix-tuning: Statistical Benefits of Reparameterization among Prompts
arXiv:2410.02200 · 3 October 2024
Minh Le, Chau Nguyen, Huy Nguyen, Quyen Tran, Trung Le, Nhat Ho

Papers citing "Revisiting Prefix-tuning: Statistical Benefits of Reparameterization among Prompts" (1 / 1 papers shown)

Parallel Scaling Law for Language Models (15 May 2025)
Mouxiang Chen, Binyuan Hui, Zeyu Cui, Jiaxi Yang, Dayiheng Liu, Jianling Sun, Junyang Lin, Zhongxin Liu
Topics: MoE, LRM