Perception Compressor: A Training-Free Prompt Compression Framework in Long Context Scenarios

28 September 2024
Jiwei Tang, Jin Xu, Tingwei Lu, Hai Lin, Yiming Zhao, Lin Hai, Hai-Tao Zheng
Abstract

Large language models (LLMs) demonstrate exceptional capabilities across a wide range of scenarios. However, they suffer from redundant information and are sensitive to the position of key information in long-context scenarios. To address these challenges, we present Perception Compressor, a training-free prompt compression framework. It comprises three components: a perception retriever that leverages guiding questions and the instruction to retrieve the most relevant demonstrations; a dual-slope ratio allocator that dynamically allocates compression ratios and open-book ratios; and a semi-guided iterative compression scheme that retains key information at the token level while removing tokens that distract the LLM. We conduct extensive experiments on long-context benchmarks, i.e., NaturalQuestions, LongBench, and MuSiQue. Experimental results show that Perception Compressor outperforms existing methods by a large margin, achieving state-of-the-art performance.
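
The retrieve-allocate-compress pipeline the abstract describes can be made concrete with a short sketch. The Python below is a minimal, illustrative assumption about how such a training-free pipeline might be wired together, not the authors' released implementation: all names (Demo, perception_retrieve, dual_slope_allocate, iterative_compress) and the scoring heuristics are hypothetical, and a real system would use an LLM-based scorer (e.g., conditional perplexity) wherever a black-box score function appears here.

    # Illustrative sketch of a training-free prompt-compression pipeline in the
    # spirit of the abstract. Names, signatures, and scoring heuristics are
    # assumptions for exposition, NOT the paper's implementation.
    from dataclasses import dataclass

    @dataclass
    class Demo:
        text: str
        relevance: float = 0.0  # filled in by the retriever

    def perception_retrieve(demos, question, instruction, score_fn):
        """Rank demonstrations by relevance to the guiding question and
        instruction. score_fn(demo_text, query) is an assumed black-box
        scorer, e.g. an LLM's log-likelihood of the query given the demo."""
        query = f"{instruction}\n{question}"
        for d in demos:
            d.relevance = score_fn(d.text, query)
        return sorted(demos, key=lambda d: d.relevance, reverse=True)

    def dual_slope_allocate(n, base_ratio, slope=0.05):
        """Assign each demonstration a keep ratio that decays linearly with
        rank, so more relevant demos keep more tokens. A second slope could
        analogously govern the open-book ratio mentioned in the abstract."""
        return [max(0.0, min(1.0, base_ratio - slope * rank)) for rank in range(n)]

    def iterative_compress(text, keep_ratio, token_score_fn):
        """Prune the lowest-scoring tokens until keep_ratio of them remain.
        A real semi-guided scheme would iterate and protect question-relevant
        tokens; this scores and prunes once for brevity."""
        tokens = text.split()
        budget = max(1, int(len(tokens) * keep_ratio))
        scored = sorted(enumerate(tokens), key=lambda t: token_score_fn(t[1]),
                        reverse=True)
        kept = sorted(idx for idx, _ in scored[:budget])
        return " ".join(tokens[i] for i in kept)

    def compress_prompt(demos, question, instruction, target_ratio,
                        score_fn, token_score_fn):
        """Full pipeline: retrieve, allocate per-demo ratios, then compress."""
        ranked = perception_retrieve(demos, question, instruction, score_fn)
        ratios = dual_slope_allocate(len(ranked), target_ratio)
        parts = [iterative_compress(d.text, r, token_score_fn)
                 for d, r in zip(ranked, ratios)]
        return "\n\n".join([instruction, *parts, question])

    # Toy usage with dummy scorers (word overlap, token length) to show the
    # call shape; swap in LLM-based scorers in practice.
    demos = [Demo("Paris is the capital of France."), Demo("Cats are mammals.")]
    print(compress_prompt(
        demos, "What is the capital of France?", "Answer briefly.",
        target_ratio=0.6,
        score_fn=lambda text, q: len(set(text.split()) & set(q.split())),
        token_score_fn=len))

The linear decay in dual_slope_allocate is one plausible reading of "dual-slope": one slope for how aggressively lower-ranked demonstrations are compressed, and a second (omitted here) for the open-book ratio.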

View on arXiv: https://arxiv.org/abs/2409.19272
@article{tang2025_2409.19272,
  title={Perception Compressor: A Training-Free Prompt Compression Framework in Long Context Scenarios},
  author={Jiwei Tang and Jin Xu and Tingwei Lu and Zhicheng Zhang and Yiming Zhao and Lin Hai and Hai-Tao Zheng},
  journal={arXiv preprint arXiv:2409.19272},
  year={2025}
}