Learning to Compress Prompt in Natural Language Formats (arXiv:2402.18700)

28 February 2024
Yu-Neng Chuang
Tianwei Xing
Chia-Yuan Chang
Zirui Liu
Xun Chen
Xia Hu
Papers citing "Learning to Compress Prompt in Natural Language Formats"

4 / 4 papers shown
Lost in the Passage: Passage-level In-context Learning Does Not Necessarily Need a "Passage"
Hao Sun, Chenming Tang, Gengyang Li, Yunfang Wu
15 Feb 2025

From Reading to Compressing: Exploring the Multi-document Reader for Prompt Compression
Eunseong Choi, Sunkyung Lee, Minjin Choi, June Park, Jongwuk Lee
03 Jan 2025

KV Cache Compression, But What Must We Give in Return? A Comprehensive Benchmark of Long Context Capable Approaches
Jiayi Yuan, Hongyi Liu, Shaochen Zhong, Yu-Neng Chuang, ..., Hongye Jin, V. Chaudhary, Zhaozhuo Xu, Zirui Liu, Xia Hu
01 Jul 2024

Large Language Models are Zero-Shot Reasoners
Takeshi Kojima, S. Gu, Machel Reid, Yutaka Matsuo, Yusuke Iwasawa
24 May 2022